Mar 20 13:21:39 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:21:39 crc restorecon[4699]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:21:39 crc restorecon[4699]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:39
crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:39 crc restorecon[4699]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:39 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 
crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc 
restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:40 crc restorecon[4699]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc 
restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:40 crc restorecon[4699]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:21:40 crc kubenswrapper[4895]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.935171 4895 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944573 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944638 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944650 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944662 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944675 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944684 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944693 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944702 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944711 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944720 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944729 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944737 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944745 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944752 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944763 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944773 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944782 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944790 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944799 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944807 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944816 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944824 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944835 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944844 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944853 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944862 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944869 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944891 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944899 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944906 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944915 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944923 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944931 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944939 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944949 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944957 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944965 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944973 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944980 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944989 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.944997 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945005 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945014 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945022 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945029 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945037 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945045 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945052 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945060 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945068 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945077 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945085 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945093 4895 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945100 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945108 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945116 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945126 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945136 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945144 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945155 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945167 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945180 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945191 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945201 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945211 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945220 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945230 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945240 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945252 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945260 4895 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.945268 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946497 4895 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946526 4895 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946545 4895 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946557 4895 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946570 4895 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946580 4895 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946593 4895 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946605 4895 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946615 4895 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946625 4895 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946634 4895 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946645 4895 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946655 4895 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946665 4895 flags.go:64] FLAG: --cgroup-root=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946673 4895 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946683 4895 flags.go:64] FLAG: --client-ca-file=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946691 4895 flags.go:64] FLAG: --cloud-config=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946701 4895 flags.go:64] FLAG: --cloud-provider=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946709 4895 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946720 4895 flags.go:64] FLAG: --cluster-domain=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946729 4895 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946739 4895 flags.go:64] FLAG: --config-dir=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946748 4895 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946758 4895 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946770 4895 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946779 4895 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946788 4895 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946798 4895 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946808 4895 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946817 4895 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946827 4895 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946836 4895 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946845 4895 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946857 4895 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946867 4895 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946876 4895 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946885 4895 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946894 4895 flags.go:64] FLAG: --enable-server="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946904 4895 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946920 4895 flags.go:64] FLAG: --event-burst="100"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946930 4895 flags.go:64] FLAG: --event-qps="50"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946939 4895 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946948 4895 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946957 4895 flags.go:64] FLAG: --eviction-hard=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946968 4895 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946978 4895 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946987 4895 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.946998 4895 flags.go:64] FLAG: --eviction-soft=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947008 4895 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947017 4895 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947026 4895 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947035 4895 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947044 4895 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947054 4895 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947063 4895 flags.go:64] FLAG: --feature-gates=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947074 4895 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947084 4895 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947093 4895 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947103 4895 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947113 4895 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947122 4895 flags.go:64] FLAG: --help="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947131 4895 flags.go:64] FLAG: --hostname-override=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947140 4895 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947149 4895 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947159 4895 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947168 4895 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947177 4895 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947190 4895 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947199 4895 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947209 4895 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947218 4895 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947227 4895 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947237 4895 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947246 4895 flags.go:64] FLAG: --kube-reserved=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947255 4895 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947264 4895 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947273 4895 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947282 4895 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947291 4895 flags.go:64] FLAG: --lock-file=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947300 4895 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947310 4895 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947320 4895 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947345 4895 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947355 4895 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947365 4895 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947375 4895 flags.go:64] FLAG: --logging-format="text"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947384 4895 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947422 4895 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947431 4895 flags.go:64] FLAG: --manifest-url=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947441 4895 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947454 4895 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947463 4895 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947475 4895 flags.go:64] FLAG: --max-pods="110"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947484 4895 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947493 4895 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947503 4895 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947512 4895 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947522 4895 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947532 4895 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947544 4895 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947567 4895 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947576 4895 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947586 4895 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947596 4895 flags.go:64] FLAG: --pod-cidr=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947605 4895 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947619 4895 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947627 4895 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947637 4895 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947646 4895 flags.go:64] FLAG: --port="10250"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947655 4895 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947664 4895 flags.go:64] FLAG: --provider-id=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947673 4895 flags.go:64] FLAG: --qos-reserved=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947683 4895 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947692 4895 flags.go:64] FLAG: --register-node="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947701 4895 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947711 4895 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947726 4895 flags.go:64] FLAG: --registry-burst="10"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947735 4895 flags.go:64] FLAG: --registry-qps="5"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947745 4895 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947755 4895 flags.go:64] FLAG: --reserved-memory=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947796 4895 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947806 4895 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947816 4895 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947825 4895 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947834 4895 flags.go:64] FLAG: --runonce="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947843 4895 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947852 4895 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947863 4895 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947873 4895 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947885 4895 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947897 4895 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947909 4895 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947922 4895 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947933 4895 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947945 4895 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947956 4895 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947968 4895 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947980 4895 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.947992 4895 flags.go:64] FLAG: --system-cgroups=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948004 4895 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948022 4895 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948034 4895 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948046 4895 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948060 4895 flags.go:64] FLAG: --tls-min-version=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948071 4895 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948080 4895 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948089 4895 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948099 4895 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948108 4895 flags.go:64] FLAG: --v="2"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948120 4895 flags.go:64] FLAG: --version="false"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948132 4895 flags.go:64] FLAG: --vmodule=""
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948144 4895 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.948154 4895 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948383 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948425 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948447 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948456 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948465 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948473 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948483 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948491 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948499 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948507 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948515 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948524 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948532 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948543 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948553 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948562 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948570 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948581 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948591 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948600 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948610 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948618 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948627 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948637 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948645 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948653 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948660 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948669 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948677 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948685 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948696 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948706 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948715 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948724 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948733 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948743 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948753 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948762 4895 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948772 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948782 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948791 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948799 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948806 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948814 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948822 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948830 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948838 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948845 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948853 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948860 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948868 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948876 4895 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948883 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948892 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948899 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948907 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948914 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948924 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948933 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948942 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948949 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948957 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948965 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948974 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948981 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948989 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.948997 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.949005 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.949013 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.949021 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.949029 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.949056 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.962199 4895 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.962244 4895 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962365 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962382 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962410 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962416 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962422 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962427 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962432 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962437 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962442 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962446 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962450 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962455 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962460 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962467 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962473 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962478 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962484 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962491 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962496 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962502 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962508 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962513 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962519 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962526 4895 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962531 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962537 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962542 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962547 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962553 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962560 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962564 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962569 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962574 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962579 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962584 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962589 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962593 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962598 4895 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962616 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962621 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962625 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962630 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962637 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962644 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962649 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962653 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962658 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962667 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962672 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962677 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962681 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962687 4895 feature_gate.go:330] unrecognized 
feature gate: AutomatedEtcdBackup Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962691 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962696 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962700 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962705 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962709 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962714 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962719 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962724 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962728 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962733 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962738 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962742 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962747 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962754 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:21:40 crc kubenswrapper[4895]: 
W0320 13:21:40.962759 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962764 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962769 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962774 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962778 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.962787 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962951 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962959 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962966 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962972 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962978 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962983 4895 feature_gate.go:330] unrecognized feature 
gate: VSphereDriverConfiguration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962990 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.962998 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963004 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963008 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963013 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963018 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963022 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963027 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963031 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963036 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963040 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963045 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963049 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963054 4895 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963059 4895 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963065 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963071 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963076 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963081 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963086 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963090 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963095 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963101 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963108 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963113 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963118 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963123 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963128 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963132 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963137 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963141 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963146 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963152 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963160 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963165 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963170 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963175 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963179 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963184 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963188 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963193 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963198 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963202 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963207 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963212 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963216 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963221 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 
20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963225 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963230 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963234 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963239 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963243 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963247 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963252 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963256 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963261 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963266 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963271 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963276 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963282 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963287 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963292 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963297 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963301 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:21:40 crc kubenswrapper[4895]: W0320 13:21:40.963307 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.963316 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.963571 4895 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 13:21:40 crc kubenswrapper[4895]: E0320 13:21:40.967880 4895 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 
UTC" logger="UnhandledError" Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.973359 4895 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.973606 4895 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.975891 4895 server.go:997] "Starting client certificate rotation" Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.975949 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 13:21:40 crc kubenswrapper[4895]: I0320 13:21:40.976199 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.009077 4895 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.011686 4895 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.012612 4895 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.032072 4895 log.go:25] "Validated CRI v1 runtime API" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.075880 4895 log.go:25] "Validated CRI v1 image API" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.078995 4895 server.go:1437] "Using cgroup driver 
setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.088082 4895 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-16-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.088139 4895 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.116058 4895 manager.go:217] Machine: {Timestamp:2026-03-20 13:21:41.113018404 +0000 UTC m=+0.622737380 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2d141aca-ef91-4eca-959b-e9b486ead362 BootID:cedb54ff-0ea2-432e-bafc-4f3a8bf58c53 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ad:49:0f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ad:49:0f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1e:c4:d4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:99:cb:50 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d2:ea:65 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8a:af:c6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:66:5a:dc:6a:fe:34 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:6e:68:b1:44:2c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 
Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown 
InstanceID:None} Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.116346 4895 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.116527 4895 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.116894 4895 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.117103 4895 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.117145 4895 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quant
ity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.117385 4895 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.117418 4895 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.117988 4895 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.118024 4895 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.118763 4895 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.118890 4895 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.122620 4895 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.122651 4895 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.122750 4895 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.122769 4895 kubelet.go:324] "Adding apiserver pod source"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.122786 4895 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.127137 4895 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.128208 4895 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.129449 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.129454 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.129557 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.129576 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.131034 4895 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132775 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132806 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132817 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132827 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132842 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132853 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132886 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132917 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132930 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132940 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132959 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.132969 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.133765 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.134333 4895 server.go:1280] "Started kubelet"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.134508 4895 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.134615 4895 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.135788 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.136126 4895 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 13:21:41 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.142023 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.142075 4895 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.142280 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.142641 4895 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.142669 4895 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.142754 4895 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.144427 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.145728 4895 factory.go:55] Registering systemd factory
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.145843 4895 factory.go:221] Registration of the systemd container factory successfully
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.147339 4895 factory.go:153] Registering CRI-O factory
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.147469 4895 factory.go:221] Registration of the crio container factory successfully
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.147646 4895 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.147745 4895 factory.go:103] Registering Raw factory
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.147836 4895 manager.go:1196] Started watching for new ooms in manager
Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.147566 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.148022 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.147738 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8f559f11dd6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.134294379 +0000 UTC m=+0.644013365,LastTimestamp:2026-03-20 13:21:41.134294379 +0000 UTC m=+0.644013365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.149205 4895 manager.go:319] Starting recovery of all containers
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.149327 4895 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160754 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160817 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160840 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160857 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160873 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160887 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160902 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160918 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160936 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160950 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160966 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160981 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.160995 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161012 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161026 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161041 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161059 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161073 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161087 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161102 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161116 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161132 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161147 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161167 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161182 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161200 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161219 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161235 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161254 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161269 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161284 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161299 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161318 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161334 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161351 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161366 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161380 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161445 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161466 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161483 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161499 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161514 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161532 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161547 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161561 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161579 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161595 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161611 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161628 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161642 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161657 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161671 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161693 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161710 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161727 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161744 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161761 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161778 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161806 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161822 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161839 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161855 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161870 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161889 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161907 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161924 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161942 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161957 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161972 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.161988 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162005 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162021 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162036 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162053 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162068 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162083 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162100 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162116 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162132 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162148 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162175 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162192 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162210 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the
actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162593 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162627 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.162648 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163476 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163521 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163611 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163643 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163666 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163699 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163738 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163771 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163793 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163815 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163849 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163871 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163901 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163921 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163944 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" 
seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163975 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.163995 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.164025 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.168605 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.168674 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169013 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 
13:21:41.169049 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169079 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169107 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169127 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169155 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169184 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169231 4895 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169469 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169775 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169864 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169913 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.169947 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170007 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170042 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170092 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170132 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170186 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170258 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170291 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170337 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170367 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170427 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170470 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170568 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170611 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170646 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170676 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170718 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170778 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170816 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170843 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170871 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170912 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.170940 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171028 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171103 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171195 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" 
seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171235 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171263 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171297 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171351 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171381 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171466 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: 
I0320 13:21:41.171602 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171769 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171799 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171842 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.171900 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172032 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172106 4895 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172264 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172606 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172726 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172787 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172827 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172889 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172950 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.172986 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173010 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173030 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173099 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173121 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" 
seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173147 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173166 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173217 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173243 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173263 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173287 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173321 4895 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173340 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173364 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173457 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173527 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173571 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173601 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.173784 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174417 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174562 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174641 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174665 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174684 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174923 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174950 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174964 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.174986 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175002 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175015 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175032 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175046 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175062 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175075 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175088 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.175105 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc 
kubenswrapper[4895]: I0320 13:21:41.175119 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.179837 4895 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.179934 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.179969 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.179990 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.180026 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" 
Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.180049 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.180077 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.180098 4895 reconstruct.go:97] "Volume reconstruction finished" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.180112 4895 reconciler.go:26] "Reconciler: start to sync state" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.184940 4895 manager.go:324] Recovery completed Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.205543 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.207417 4895 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.208220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.208288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.208306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210241 4895 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210290 4895 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210334 4895 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210334 4895 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210406 4895 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.210436 4895 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.210534 4895 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.212088 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.212155 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.234227 4895 policy_none.go:49] "None policy: Start" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.235411 4895 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.235458 4895 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.243574 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.289700 4895 manager.go:334] "Starting Device Plugin manager" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.289796 4895 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.289819 4895 server.go:79] "Starting device plugin registration server" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.290577 4895 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.290669 4895 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.290876 4895 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.291079 4895 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.291103 4895 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.303183 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.311159 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.311257 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.318588 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.318642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.318653 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.318809 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.319238 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.319320 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.319887 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.319960 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.319971 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320161 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320550 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320689 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320823 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320845 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320854 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.320949 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.321341 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.321524 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.321613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.321740 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.321791 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.322015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.322070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.322101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.322673 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.323007 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.323050 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.323504 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.323562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.323739 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324448 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324504 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324534 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.324991 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 
crc kubenswrapper[4895]: I0320 13:21:41.325010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.325020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.325221 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.325261 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.326716 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.326766 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.326786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.345565 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382310 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382409 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382466 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382533 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382709 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382783 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382866 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382934 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.382978 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.383031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.383115 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.383215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.391047 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.394156 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.394227 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.394247 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.394286 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.394875 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484500 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484546 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484565 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484585 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484607 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484651 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484675 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484731 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484863 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484763 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484898 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484915 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484917 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484963 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.484972 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485015 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485055 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485092 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485128 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485384 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.485451 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.595712 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.597700 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.597751 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.597767 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.597797 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.598497 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.653791 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.679838 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.695104 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.702681 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4fdc8cb9039e62006620d2133747ad28a0851f57c6007e7c159f47b38e2cf214 WatchSource:0}: Error finding container 4fdc8cb9039e62006620d2133747ad28a0851f57c6007e7c159f47b38e2cf214: Status 404 returned error can't find the container with id 4fdc8cb9039e62006620d2133747ad28a0851f57c6007e7c159f47b38e2cf214 Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.714098 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3b1790f596e1fbc652e23e80226a946c4e3652201ace7e1722caa40cfdae52e3 WatchSource:0}: Error finding container 
3b1790f596e1fbc652e23e80226a946c4e3652201ace7e1722caa40cfdae52e3: Status 404 returned error can't find the container with id 3b1790f596e1fbc652e23e80226a946c4e3652201ace7e1722caa40cfdae52e3 Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.715018 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e762a13cc71219ccac3070aec5d3aaa89cabaf3e5841dbaf3295957c06ce119c WatchSource:0}: Error finding container e762a13cc71219ccac3070aec5d3aaa89cabaf3e5841dbaf3295957c06ce119c: Status 404 returned error can't find the container with id e762a13cc71219ccac3070aec5d3aaa89cabaf3e5841dbaf3295957c06ce119c Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.719561 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.727320 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.744256 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0d43b0c8d244a191856f96aa3cd141a08c64c20c6694fc77b7c3256c06724afd WatchSource:0}: Error finding container 0d43b0c8d244a191856f96aa3cd141a08c64c20c6694fc77b7c3256c06724afd: Status 404 returned error can't find the container with id 0d43b0c8d244a191856f96aa3cd141a08c64c20c6694fc77b7c3256c06724afd Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.746288 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms" Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.747005 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-375febc246abf8c8bccdc59c59052c78374e169bc5451cfe8f521d071e4c1a4d WatchSource:0}: Error finding container 375febc246abf8c8bccdc59c59052c78374e169bc5451cfe8f521d071e4c1a4d: Status 404 returned error can't find the container with id 375febc246abf8c8bccdc59c59052c78374e169bc5451cfe8f521d071e4c1a4d Mar 20 13:21:41 crc kubenswrapper[4895]: W0320 13:21:41.955256 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:41 crc kubenswrapper[4895]: E0320 13:21:41.955456 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:41 crc kubenswrapper[4895]: I0320 13:21:41.999110 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.000725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.000770 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.000782 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.000816 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:42 crc kubenswrapper[4895]: E0320 13:21:42.001295 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.136792 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:42 crc kubenswrapper[4895]: W0320 13:21:42.171192 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:42 crc kubenswrapper[4895]: 
E0320 13:21:42.171361 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.216157 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4fdc8cb9039e62006620d2133747ad28a0851f57c6007e7c159f47b38e2cf214"} Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.217295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"375febc246abf8c8bccdc59c59052c78374e169bc5451cfe8f521d071e4c1a4d"} Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.218500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0d43b0c8d244a191856f96aa3cd141a08c64c20c6694fc77b7c3256c06724afd"} Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.219770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3b1790f596e1fbc652e23e80226a946c4e3652201ace7e1722caa40cfdae52e3"} Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.221169 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e762a13cc71219ccac3070aec5d3aaa89cabaf3e5841dbaf3295957c06ce119c"} Mar 20 13:21:42 crc 
kubenswrapper[4895]: W0320 13:21:42.498648 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:42 crc kubenswrapper[4895]: E0320 13:21:42.499081 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:42 crc kubenswrapper[4895]: W0320 13:21:42.533119 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:42 crc kubenswrapper[4895]: E0320 13:21:42.533194 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:42 crc kubenswrapper[4895]: E0320 13:21:42.547172 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.801461 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.804181 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.804245 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.804257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:42 crc kubenswrapper[4895]: I0320 13:21:42.804291 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:42 crc kubenswrapper[4895]: E0320 13:21:42.804989 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.127451 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:43 crc kubenswrapper[4895]: E0320 13:21:43.128571 4895 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.136508 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.226538 4895 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="c048c94d3af607e47d88548a0a8d6642d60ec5fa594f7c08afde62a20cf64b6f" exitCode=0 Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.226605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c048c94d3af607e47d88548a0a8d6642d60ec5fa594f7c08afde62a20cf64b6f"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.226766 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.228489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.228542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.228561 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.229772 4895 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="08c0550c86a3787739b827737235cdcd006bd136fe55b8d1519c39815c168881" exitCode=0 Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.229820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"08c0550c86a3787739b827737235cdcd006bd136fe55b8d1519c39815c168881"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.229865 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.230850 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 
crc kubenswrapper[4895]: I0320 13:21:43.230877 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.230886 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.231782 4895 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4ddefb8d858a984eb301cd969bd7c31a25939d18b487b8dd3e60b783230f7053" exitCode=0 Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.231874 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4ddefb8d858a984eb301cd969bd7c31a25939d18b487b8dd3e60b783230f7053"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.231966 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.233046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.233092 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.233112 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.235788 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5" exitCode=0 Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.235883 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.235906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.236875 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.236903 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.236912 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.240257 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"88cd703000786eb37297ae28517d640b80c251621377c7aee65173b5438d3243"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.240442 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.240597 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bbb6b9e925f953219bc8a6e03d51655c92022468746f63390799220d3c12da42"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.240732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f6947eccf2e6472e6aadd8de5a75013a05892dd37296d7c39c4953ee9d228fdb"} Mar 
20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.240859 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae27d8e0648506efab87e9191454765762f4bf5387968c9227cd1717d72ad478"} Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.242042 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.242067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.242077 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.242771 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.243543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.243572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:43 crc kubenswrapper[4895]: I0320 13:21:43.243580 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:43 crc kubenswrapper[4895]: W0320 13:21:43.914677 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:43 crc kubenswrapper[4895]: E0320 13:21:43.914824 4895 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.136637 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:44 crc kubenswrapper[4895]: E0320 13:21:44.148803 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="3.2s" Mar 20 13:21:44 crc kubenswrapper[4895]: W0320 13:21:44.193350 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:44 crc kubenswrapper[4895]: E0320 13:21:44.193458 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:44 crc kubenswrapper[4895]: W0320 13:21:44.207137 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Mar 20 13:21:44 
crc kubenswrapper[4895]: E0320 13:21:44.207215 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.245304 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"643b1e6cb16d695ff49194cbc022facd623f374d891b4a7e7c035bd86b27544c"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.245358 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"93c541771b375c66b14a13d7c11d51b5083ae81e38e4b458de6048e86d9f48b9"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.245367 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82ccc44143bbe48a9c92634eddb8bd194a61c66e9634c6619ddcea070483a132"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.245486 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.246413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.246433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.246441 4895 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.252923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.252970 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.252980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.252989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.254430 4895 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f783f11682af515f2e62685beb5ddf24bb3cab9fe758694630ebf6bdea87a834" exitCode=0 Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.254484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f783f11682af515f2e62685beb5ddf24bb3cab9fe758694630ebf6bdea87a834"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.254621 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.255607 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.255634 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.255643 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.258702 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.259184 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.259188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"12685bf7bf0a3354f3b62bc075ddc80fce18bc63fd64a0371c7d73eedb2f3b60"} Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.259960 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.259994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.260006 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.260641 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.260707 4895 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.260774 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.405958 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.410148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.410221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.410239 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:44 crc kubenswrapper[4895]: I0320 13:21:44.410284 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:44 crc kubenswrapper[4895]: E0320 13:21:44.410921 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.266922 4895 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="820def28d9e3d4c716c8db4be674e40acc308683cd0bf510f830f3d8ec5f9c09" exitCode=0 Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.267146 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"820def28d9e3d4c716c8db4be674e40acc308683cd0bf510f830f3d8ec5f9c09"} Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.267170 4895 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.268806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.268863 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.268884 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.278275 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe43d4cbc9693b6f0e2c79424a36ab43d0bb4d704bc4b698e36b1b6788d2120e"} Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.278635 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.278451 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.278557 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.278362 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.280796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.280879 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.280898 4895 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281225 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281248 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281308 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.281430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.444127 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.572688 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.573064 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.574812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.574865 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4895]: I0320 13:21:45.574885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289154 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2a2b6acdf4ccecd21a5245e3f618b65135a4c2df76bd364a9ec786e91b906012"} Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289228 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"982222c341763a626bcfd9f38a24b5e29f103542ecb4f28213915e57fd17ac05"} Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289257 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9641c877e0b4dbc9af3904bb5aef0d08b84a994addd4e24c3ba900f01cae94c5"} Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d44b79d49a41e4d3922005161205e96ba319b48da9d9d8108d878df7172dd623"} Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289302 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289425 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.289516 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290785 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290826 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290828 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290856 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.290845 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:46 crc kubenswrapper[4895]: I0320 13:21:46.598765 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.300447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c204748046dbce0b963d7a1fe94b54722f8995482981ade9350f4ba3d9021be"} Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.300519 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.300566 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.300578 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302512 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302586 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302592 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302608 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.302620 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.353925 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.611949 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.613831 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.613895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.613913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.613949 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:47 crc kubenswrapper[4895]: I0320 13:21:47.807139 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.304922 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.304964 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.307778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.307882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.307910 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.308117 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.308248 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.308276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.319462 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.319724 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.321262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:48 crc 
kubenswrapper[4895]: I0320 13:21:48.321317 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.321332 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.331060 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:48 crc kubenswrapper[4895]: I0320 13:21:48.563378 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.308171 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.308171 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.308320 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.308386 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310747 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310773 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310816 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310820 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310846 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.310869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.311656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.311839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.311986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:49 crc kubenswrapper[4895]: I0320 13:21:49.478782 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:50 crc kubenswrapper[4895]: I0320 13:21:50.311311 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:50 crc kubenswrapper[4895]: I0320 13:21:50.312264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:50 crc kubenswrapper[4895]: I0320 13:21:50.312298 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:50 crc kubenswrapper[4895]: I0320 13:21:50.312309 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:51 crc kubenswrapper[4895]: E0320 13:21:51.304009 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.313951 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.314870 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.314920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.314937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.708517 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.708796 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.710575 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.710645 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:51 crc kubenswrapper[4895]: I0320 13:21:51.710670 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.478821 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:21:52 crc 
kubenswrapper[4895]: I0320 13:21:52.478926 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.533042 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.533205 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.534554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.534594 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:52 crc kubenswrapper[4895]: I0320 13:21:52.534607 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.137991 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.333641 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.336054 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="fe43d4cbc9693b6f0e2c79424a36ab43d0bb4d704bc4b698e36b1b6788d2120e" exitCode=255 Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.336136 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fe43d4cbc9693b6f0e2c79424a36ab43d0bb4d704bc4b698e36b1b6788d2120e"} Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.336437 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.337603 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.337653 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.337670 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.338458 4895 scope.go:117] "RemoveContainer" containerID="fe43d4cbc9693b6f0e2c79424a36ab43d0bb4d704bc4b698e36b1b6788d2120e" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.444564 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.444661 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.531605 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:21:55 crc kubenswrapper[4895]: I0320 13:21:55.531665 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.547161 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f559f11dd6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.134294379 +0000 UTC m=+0.644013365,LastTimestamp:2026-03-20 13:21:41.134294379 +0000 UTC m=+0.644013365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.551300 4895 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a 
signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:55 crc kubenswrapper[4895]: W0320 13:21:55.553448 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.553522 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.554140 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:21:55 crc kubenswrapper[4895]: W0320 13:21:55.555812 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.555854 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:55 crc kubenswrapper[4895]: W0320 13:21:55.558295 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.558343 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:55 crc kubenswrapper[4895]: W0320 13:21:55.561956 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.562017 4895 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:55 crc kubenswrapper[4895]: E0320 13:21:55.567064 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:55Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.140246 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:56Z is after 2026-02-23T05:33:13Z Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.340788 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.343457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468"} Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.343672 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 
13:21:56.345039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.345078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:56 crc kubenswrapper[4895]: I0320 13:21:56.345089 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.142576 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:57Z is after 2026-02-23T05:33:13Z Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.348838 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.349879 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.352440 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" exitCode=255 Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.352513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468"} Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.352654 4895 scope.go:117] 
"RemoveContainer" containerID="fe43d4cbc9693b6f0e2c79424a36ab43d0bb4d704bc4b698e36b1b6788d2120e" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.352796 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.359246 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.359308 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.359325 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.360248 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:21:57 crc kubenswrapper[4895]: E0320 13:21:57.360547 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:21:57 crc kubenswrapper[4895]: I0320 13:21:57.807176 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.141085 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:58Z is after 
2026-02-23T05:33:13Z Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.358791 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.361898 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.363215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.363263 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.363284 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.364173 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:21:58 crc kubenswrapper[4895]: E0320 13:21:58.364559 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.599897 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.600177 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.601675 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.601765 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.601789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:58 crc kubenswrapper[4895]: I0320 13:21:58.620800 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 13:21:59 crc kubenswrapper[4895]: I0320 13:21:59.141668 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:59Z is after 2026-02-23T05:33:13Z Mar 20 13:21:59 crc kubenswrapper[4895]: I0320 13:21:59.364981 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:59 crc kubenswrapper[4895]: I0320 13:21:59.366894 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:59 crc kubenswrapper[4895]: I0320 13:21:59.366964 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:59 crc kubenswrapper[4895]: I0320 13:21:59.366983 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.141841 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:00Z is after 2026-02-23T05:33:13Z Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.450907 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.451168 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.452951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.452995 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.453039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.453684 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:22:00 crc kubenswrapper[4895]: E0320 13:22:00.453920 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:00 crc kubenswrapper[4895]: I0320 13:22:00.456121 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.142812 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:01Z is after 2026-02-23T05:33:13Z Mar 20 13:22:01 crc kubenswrapper[4895]: E0320 13:22:01.304280 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.369372 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.370615 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.370651 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.370660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.371268 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:22:01 crc kubenswrapper[4895]: E0320 13:22:01.371471 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:01 crc kubenswrapper[4895]: W0320 13:22:01.477333 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:01Z is after 2026-02-23T05:33:13Z Mar 20 13:22:01 crc kubenswrapper[4895]: E0320 13:22:01.477589 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.954583 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.956220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.956270 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.956289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:01 crc kubenswrapper[4895]: I0320 13:22:01.956321 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:01 crc kubenswrapper[4895]: E0320 13:22:01.964689 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:01 crc kubenswrapper[4895]: E0320 13:22:01.973932 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io 
\"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:02 crc kubenswrapper[4895]: I0320 13:22:02.142492 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:02 crc kubenswrapper[4895]: I0320 13:22:02.480322 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:22:02 crc kubenswrapper[4895]: I0320 13:22:02.480501 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.143218 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.177724 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.177984 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.179673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.179729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.179749 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:03 crc kubenswrapper[4895]: I0320 13:22:03.180653 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:22:03 crc kubenswrapper[4895]: E0320 13:22:03.180934 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:04 crc kubenswrapper[4895]: I0320 13:22:04.144586 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:04 crc kubenswrapper[4895]: I0320 13:22:04.306338 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:22:04 crc kubenswrapper[4895]: I0320 13:22:04.330632 4895 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:22:04 crc kubenswrapper[4895]: W0320 13:22:04.818592 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is 
forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:22:04 crc kubenswrapper[4895]: E0320 13:22:04.818697 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:05 crc kubenswrapper[4895]: W0320 13:22:05.001673 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.001756 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:05 crc kubenswrapper[4895]: I0320 13:22:05.140587 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.553428 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f559f11dd6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.134294379 
+0000 UTC m=+0.644013365,LastTimestamp:2026-03-20 13:21:41.134294379 +0000 UTC m=+0.644013365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.559348 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.566644 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.574264 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.579768 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a88b9fd2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.293268946 +0000 UTC m=+0.802987942,LastTimestamp:2026-03-20 13:21:41.293268946 +0000 UTC m=+0.802987942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.587314 4895 event.go:359] "Server rejected event (will not retry!)" 
err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.318626429 +0000 UTC m=+0.828345395,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.592905 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.318648889 +0000 UTC m=+0.828367855,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.600121 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.318658039 +0000 UTC m=+0.828377005,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.606925 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.319917971 +0000 UTC m=+0.829636927,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.614379 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.319968441 +0000 UTC m=+0.829687397,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.621004 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.319976101 +0000 UTC m=+0.829695067,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.626333 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.320840472 +0000 UTC m=+0.830559428,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.633072 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.320851682 +0000 UTC m=+0.830570648,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.640736 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC 
m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.320860082 +0000 UTC m=+0.830579048,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.646739 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.321681084 +0000 UTC m=+0.831400080,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.652304 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.321768024 +0000 UTC m=+0.831487040,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.653519 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.321812754 +0000 UTC m=+0.831531750,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.657996 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.322050204 +0000 UTC m=+0.831769200,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.662776 4895 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.322088324 +0000 UTC m=+0.831807330,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.669880 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.322111624 +0000 UTC m=+0.831830620,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.674961 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.323543997 +0000 UTC m=+0.833262973,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.681266 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.323584667 +0000 UTC m=+0.833303643,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.687762 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b565c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b565c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208315484 +0000 UTC m=+0.718034490,LastTimestamp:2026-03-20 13:21:41.323754817 +0000 UTC m=+0.833473793,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.694232 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37a9b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37a9b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208267543 +0000 UTC m=+0.717986549,LastTimestamp:2026-03-20 13:21:41.324474508 +0000 UTC m=+0.834193484,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.698668 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f55a37b1ac0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f55a37b1ac0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.208300224 +0000 UTC m=+0.718019220,LastTimestamp:2026-03-20 13:21:41.324498018 +0000 UTC m=+0.834217004,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.705151 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f55c1a276fe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.714196222 +0000 UTC m=+1.223915198,LastTimestamp:2026-03-20 13:21:41.714196222 +0000 UTC m=+1.223915198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.711236 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55c1c8ccea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.716708586 +0000 UTC m=+1.226427552,LastTimestamp:2026-03-20 13:21:41.716708586 +0000 UTC m=+1.226427552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.716023 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f55c219d1b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.722018225 +0000 UTC m=+1.231737201,LastTimestamp:2026-03-20 13:21:41.722018225 +0000 UTC m=+1.231737201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.721039 4895 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f55c398077b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.747066747 +0000 UTC m=+1.256785733,LastTimestamp:2026-03-20 13:21:41.747066747 +0000 UTC m=+1.256785733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.725698 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f55c3c41870 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:41.749954672 +0000 UTC m=+1.259673638,LastTimestamp:2026-03-20 
13:21:41.749954672 +0000 UTC m=+1.259673638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.731131 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f55e7f45c4d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.357097549 +0000 UTC m=+1.866816525,LastTimestamp:2026-03-20 13:21:42.357097549 +0000 UTC m=+1.866816525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.737508 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55e8148b07 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.359206663 +0000 UTC m=+1.868925639,LastTimestamp:2026-03-20 13:21:42.359206663 +0000 UTC m=+1.868925639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.743339 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f55e81727d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.359377874 +0000 UTC m=+1.869096880,LastTimestamp:2026-03-20 13:21:42.359377874 +0000 UTC m=+1.869096880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.749258 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f55e8239d47 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.360194375 +0000 UTC m=+1.869913341,LastTimestamp:2026-03-20 13:21:42.360194375 +0000 UTC m=+1.869913341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.755129 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f55e82de794 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.360868756 +0000 UTC m=+1.870587732,LastTimestamp:2026-03-20 13:21:42.360868756 +0000 UTC m=+1.870587732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.759195 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f55e8c76d03 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.370929923 +0000 UTC m=+1.880648899,LastTimestamp:2026-03-20 13:21:42.370929923 +0000 UTC m=+1.880648899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.764286 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55e8eaa1cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.373237197 +0000 UTC m=+1.882956173,LastTimestamp:2026-03-20 13:21:42.373237197 +0000 UTC m=+1.882956173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.771570 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55e90d0b27 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.375492391 +0000 UTC m=+1.885211447,LastTimestamp:2026-03-20 13:21:42.375492391 +0000 UTC m=+1.885211447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.778451 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f55e913a35b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.375924571 +0000 UTC m=+1.885643547,LastTimestamp:2026-03-20 13:21:42.375924571 +0000 UTC m=+1.885643547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.785683 4895 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f55e9176d20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.376172832 +0000 UTC m=+1.885891808,LastTimestamp:2026-03-20 13:21:42.376172832 +0000 UTC m=+1.885891808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.792607 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f55e95a1ccb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.380543179 +0000 UTC m=+1.890262185,LastTimestamp:2026-03-20 13:21:42.380543179 +0000 UTC m=+1.890262185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.797561 4895 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55ff06567f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.744151679 +0000 UTC m=+2.253870675,LastTimestamp:2026-03-20 13:21:42.744151679 +0000 UTC m=+2.253870675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.802996 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55ffd5fe1e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.757760542 +0000 UTC m=+2.267479548,LastTimestamp:2026-03-20 13:21:42.757760542 +0000 UTC m=+2.267479548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.809545 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55fff97815 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.760085525 +0000 UTC m=+2.269804541,LastTimestamp:2026-03-20 13:21:42.760085525 +0000 UTC m=+2.269804541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.815642 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f560d78fc7a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.986546298 +0000 UTC m=+2.496265294,LastTimestamp:2026-03-20 13:21:42.986546298 +0000 UTC m=+2.496265294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.821283 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f560e5e4d51 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.001574737 +0000 UTC m=+2.511293703,LastTimestamp:2026-03-20 13:21:43.001574737 +0000 UTC m=+2.511293703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.827694 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f560e77016c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.003193708 +0000 UTC m=+2.512912674,LastTimestamp:2026-03-20 13:21:43.003193708 +0000 UTC m=+2.512912674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.833989 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f56188e2f64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.172484964 +0000 UTC m=+2.682203930,LastTimestamp:2026-03-20 13:21:43.172484964 +0000 UTC 
m=+2.682203930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.838871 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f56192d6c41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.182920769 +0000 UTC m=+2.692639735,LastTimestamp:2026-03-20 13:21:43.182920769 +0000 UTC m=+2.692639735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.846073 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f561c0746f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.230752502 +0000 UTC m=+2.740471508,LastTimestamp:2026-03-20 13:21:43.230752502 +0000 UTC m=+2.740471508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.851307 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f561c27526b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.232852587 +0000 UTC m=+2.742571553,LastTimestamp:2026-03-20 13:21:43.232852587 +0000 UTC m=+2.742571553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.857029 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f561c488c30 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.235030064 +0000 UTC m=+2.744749030,LastTimestamp:2026-03-20 13:21:43.235030064 +0000 UTC m=+2.744749030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.862697 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f561cbc46c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.242614468 +0000 UTC m=+2.752333434,LastTimestamp:2026-03-20 13:21:43.242614468 +0000 UTC m=+2.752333434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.868736 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f562ce669b0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.513811376 +0000 UTC m=+3.023530332,LastTimestamp:2026-03-20 13:21:43.513811376 +0000 UTC m=+3.023530332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.873476 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f562d411eb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.519755958 +0000 UTC m=+3.029474934,LastTimestamp:2026-03-20 13:21:43.519755958 +0000 UTC m=+3.029474934,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.877844 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f562d622c5f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.521922143 +0000 UTC m=+3.031641109,LastTimestamp:2026-03-20 13:21:43.521922143 +0000 UTC m=+3.031641109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.884708 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f562d6504d1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.522108625 +0000 UTC m=+3.031827591,LastTimestamp:2026-03-20 
13:21:43.522108625 +0000 UTC m=+3.031827591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.889965 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f562dcafce9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.528791273 +0000 UTC m=+3.038510259,LastTimestamp:2026-03-20 13:21:43.528791273 +0000 UTC m=+3.038510259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.896195 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f562ddea220 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.530078752 +0000 UTC m=+3.039797718,LastTimestamp:2026-03-20 13:21:43.530078752 +0000 UTC m=+3.039797718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.902493 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f562e5a3f64 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.53817994 +0000 UTC m=+3.047898906,LastTimestamp:2026-03-20 13:21:43.53817994 +0000 UTC m=+3.047898906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.908798 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f562f0b308d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.549776013 +0000 UTC m=+3.059494979,LastTimestamp:2026-03-20 13:21:43.549776013 +0000 UTC m=+3.059494979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.914424 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f562f479cf2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.553735922 +0000 UTC m=+3.063454878,LastTimestamp:2026-03-20 13:21:43.553735922 +0000 UTC m=+3.063454878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.918517 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f562f5cd45a 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.555126362 +0000 UTC m=+3.064845328,LastTimestamp:2026-03-20 13:21:43.555126362 +0000 UTC m=+3.064845328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.924814 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f563b2a708d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.753150605 +0000 UTC m=+3.262869571,LastTimestamp:2026-03-20 13:21:43.753150605 +0000 UTC m=+3.262869571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.929525 4895 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f563b54ad2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.755918635 +0000 UTC m=+3.265637601,LastTimestamp:2026-03-20 13:21:43.755918635 +0000 UTC m=+3.265637601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.936277 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f563bc7bfee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.763460078 +0000 UTC m=+3.273179044,LastTimestamp:2026-03-20 13:21:43.763460078 +0000 UTC m=+3.273179044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.940838 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f563be2be0f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.765229071 +0000 UTC m=+3.274948047,LastTimestamp:2026-03-20 13:21:43.765229071 +0000 UTC m=+3.274948047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.946114 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f563c2cd1e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.770083816 +0000 UTC 
m=+3.279802782,LastTimestamp:2026-03-20 13:21:43.770083816 +0000 UTC m=+3.279802782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.952058 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f563c828693 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.775700627 +0000 UTC m=+3.285419603,LastTimestamp:2026-03-20 13:21:43.775700627 +0000 UTC m=+3.285419603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.957641 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5649782e58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.993126488 +0000 UTC m=+3.502845454,LastTimestamp:2026-03-20 13:21:43.993126488 +0000 UTC m=+3.502845454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.962810 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5649912599 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:43.994762649 +0000 UTC m=+3.504481615,LastTimestamp:2026-03-20 13:21:43.994762649 +0000 UTC m=+3.504481615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.969510 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f564a89e993 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.011065747 +0000 UTC m=+3.520784713,LastTimestamp:2026-03-20 13:21:44.011065747 +0000 UTC m=+3.520784713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.975586 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f564a9b38f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.012200185 +0000 UTC m=+3.521919151,LastTimestamp:2026-03-20 13:21:44.012200185 +0000 UTC m=+3.521919151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.981826 4895 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f564aab4ac0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.013253312 +0000 UTC m=+3.522972278,LastTimestamp:2026-03-20 13:21:44.013253312 +0000 UTC m=+3.522972278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.986905 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f565730b1a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.223322531 +0000 UTC m=+3.733041507,LastTimestamp:2026-03-20 13:21:44.223322531 +0000 UTC 
m=+3.733041507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:05 crc kubenswrapper[4895]: E0320 13:22:05.994132 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f56588c44d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.246101205 +0000 UTC m=+3.755820161,LastTimestamp:2026-03-20 13:21:44.246101205 +0000 UTC m=+3.755820161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.001641 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5658992d8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.24694721 +0000 UTC m=+3.756666166,LastTimestamp:2026-03-20 13:21:44.24694721 +0000 UTC m=+3.756666166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.009143 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f565933faf8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.257092344 +0000 UTC m=+3.766811310,LastTimestamp:2026-03-20 13:21:44.257092344 +0000 UTC m=+3.766811310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.015791 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5665dbece2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.469425378 +0000 UTC m=+3.979144344,LastTimestamp:2026-03-20 13:21:44.469425378 +0000 UTC m=+3.979144344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.021257 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5666011132 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.471859506 +0000 UTC m=+3.981578472,LastTimestamp:2026-03-20 13:21:44.471859506 +0000 UTC m=+3.981578472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.026262 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f566664df39 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.478400313 +0000 UTC m=+3.988119279,LastTimestamp:2026-03-20 13:21:44.478400313 +0000 UTC m=+3.988119279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.030788 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5666c14269 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.484455017 +0000 UTC m=+3.994173983,LastTimestamp:2026-03-20 13:21:44.484455017 +0000 UTC m=+3.994173983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.035841 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5695abe942 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.27158509 +0000 UTC m=+4.781304096,LastTimestamp:2026-03-20 13:21:45.27158509 +0000 UTC m=+4.781304096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.041029 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56a3d870d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.509384408 +0000 UTC m=+5.019103384,LastTimestamp:2026-03-20 13:21:45.509384408 +0000 UTC m=+5.019103384,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.045465 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8f56a490ad5d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.521458525 +0000 UTC m=+5.031177501,LastTimestamp:2026-03-20 13:21:45.521458525 +0000 UTC m=+5.031177501,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.050230 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56a4a2794a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.522624842 +0000 UTC m=+5.032343848,LastTimestamp:2026-03-20 13:21:45.522624842 +0000 UTC m=+5.032343848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.055840 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56b2de7be0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.761438688 +0000 UTC m=+5.271157654,LastTimestamp:2026-03-20 13:21:45.761438688 +0000 UTC m=+5.271157654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.059648 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56b3c9ac82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.776852098 +0000 UTC m=+5.286571094,LastTimestamp:2026-03-20 13:21:45.776852098 +0000 UTC m=+5.286571094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.064385 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56b3e0590b openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:45.778338059 +0000 UTC m=+5.288057055,LastTimestamp:2026-03-20 13:21:45.778338059 +0000 UTC m=+5.288057055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.070511 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56c1bbd735 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.010826549 +0000 UTC m=+5.520545515,LastTimestamp:2026-03-20 13:21:46.010826549 +0000 UTC m=+5.520545515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.075938 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8f56c28e388c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.024614028 +0000 UTC m=+5.534333004,LastTimestamp:2026-03-20 13:21:46.024614028 +0000 UTC m=+5.534333004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.082040 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56c2a4d3b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.026095539 +0000 UTC m=+5.535814525,LastTimestamp:2026-03-20 13:21:46.026095539 +0000 UTC m=+5.535814525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.087503 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56d03a449c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.253993116 +0000 UTC m=+5.763712092,LastTimestamp:2026-03-20 13:21:46.253993116 +0000 UTC m=+5.763712092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.093888 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56d179e77c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.274940796 +0000 UTC m=+5.784659772,LastTimestamp:2026-03-20 13:21:46.274940796 +0000 UTC m=+5.784659772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.099642 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8f56d1966481 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.276807809 +0000 UTC m=+5.786526825,LastTimestamp:2026-03-20 13:21:46.276807809 +0000 UTC m=+5.786526825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.107204 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56df652abf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.508462783 +0000 UTC m=+6.018181749,LastTimestamp:2026-03-20 13:21:46.508462783 +0000 UTC m=+6.018181749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.112776 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f56e02e4c60 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:46.521644128 +0000 UTC m=+6.031363084,LastTimestamp:2026-03-20 13:21:46.521644128 +0000 UTC m=+6.031363084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.124272 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:22:06 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f584342b49e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:06 crc kubenswrapper[4895]: body: Mar 20 13:22:06 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:52.478893214 +0000 UTC m=+11.988612220,LastTimestamp:2026-03-20 13:21:52.478893214 +0000 UTC m=+11.988612220,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:06 crc kubenswrapper[4895]: > Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.130441 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f584343d678 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:52.478967416 +0000 UTC m=+11.988686422,LastTimestamp:2026-03-20 13:21:52.478967416 +0000 UTC m=+11.988686422,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: I0320 13:22:06.137738 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.137849 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f5658992d8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5658992d8a 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.24694721 +0000 UTC m=+3.756666166,LastTimestamp:2026-03-20 13:21:55.33967337 +0000 UTC m=+14.849392376,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.142037 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:22:06 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f58f4085fa7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:06 crc kubenswrapper[4895]: body: Mar 20 13:22:06 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:55.444637607 +0000 UTC m=+14.954356583,LastTimestamp:2026-03-20 13:21:55.444637607 +0000 UTC 
m=+14.954356583,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:06 crc kubenswrapper[4895]: > Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.147127 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f58f4093d9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:55.444694429 +0000 UTC m=+14.954413415,LastTimestamp:2026-03-20 13:21:55.444694429 +0000 UTC m=+14.954413415,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.152765 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:22:06 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f58f9381255 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:22:06 crc kubenswrapper[4895]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:22:06 crc kubenswrapper[4895]: Mar 20 13:22:06 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:55.531649621 +0000 UTC m=+15.041368587,LastTimestamp:2026-03-20 13:21:55.531649621 +0000 UTC m=+15.041368587,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:06 crc kubenswrapper[4895]: > Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.158270 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f58f938a13a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:55.531686202 +0000 UTC m=+15.041405168,LastTimestamp:2026-03-20 13:21:55.531686202 +0000 UTC m=+15.041405168,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.164017 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f5665dbece2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5665dbece2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.469425378 +0000 UTC m=+3.979144344,LastTimestamp:2026-03-20 13:21:55.600989728 +0000 UTC m=+15.110708694,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.169133 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f566664df39\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f566664df39 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:44.478400313 +0000 UTC m=+3.988119279,LastTimestamp:2026-03-20 13:21:55.614598603 +0000 UTC m=+15.124317569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.174203 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:22:06 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a97667687 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:06 crc kubenswrapper[4895]: body: Mar 20 13:22:06 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480457351 +0000 UTC m=+21.990176317,LastTimestamp:2026-03-20 13:22:02.480457351 +0000 UTC m=+21.990176317,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:06 crc kubenswrapper[4895]: > Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.179178 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a9767ae0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480537103 +0000 UTC m=+21.990256069,LastTimestamp:2026-03-20 13:22:02.480537103 +0000 UTC m=+21.990256069,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:06 crc kubenswrapper[4895]: W0320 13:22:06.541652 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:06 crc kubenswrapper[4895]: E0320 13:22:06.541721 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:07 crc kubenswrapper[4895]: I0320 13:22:07.146877 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:08 crc 
kubenswrapper[4895]: I0320 13:22:08.140709 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:08 crc kubenswrapper[4895]: I0320 13:22:08.965545 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:08 crc kubenswrapper[4895]: I0320 13:22:08.967430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:08 crc kubenswrapper[4895]: I0320 13:22:08.967497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:08 crc kubenswrapper[4895]: I0320 13:22:08.967525 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:08 crc kubenswrapper[4895]: I0320 13:22:08.967585 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:08 crc kubenswrapper[4895]: E0320 13:22:08.974176 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:08 crc kubenswrapper[4895]: E0320 13:22:08.980750 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:09 crc kubenswrapper[4895]: I0320 13:22:09.143771 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 13:22:10 crc kubenswrapper[4895]: I0320 13:22:10.143131 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:10 crc kubenswrapper[4895]: W0320 13:22:10.830593 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:22:10 crc kubenswrapper[4895]: E0320 13:22:10.830706 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:11 crc kubenswrapper[4895]: I0320 13:22:11.138300 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:11 crc kubenswrapper[4895]: E0320 13:22:11.304544 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.144812 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.479275 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.479614 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.479710 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.479933 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.482260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.482341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.482364 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.483301 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f6947eccf2e6472e6aadd8de5a75013a05892dd37296d7c39c4953ee9d228fdb"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed 
startup probe, will be restarted" Mar 20 13:22:12 crc kubenswrapper[4895]: I0320 13:22:12.483830 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f6947eccf2e6472e6aadd8de5a75013a05892dd37296d7c39c4953ee9d228fdb" gracePeriod=30 Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.487858 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f5a97667687\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:22:12 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a97667687 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:12 crc kubenswrapper[4895]: body: Mar 20 13:22:12 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480457351 +0000 UTC m=+21.990176317,LastTimestamp:2026-03-20 13:22:12.479571162 +0000 UTC m=+31.989290168,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:12 crc kubenswrapper[4895]: > Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.493689 4895 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189e8f5a9767ae0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a9767ae0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480537103 +0000 UTC m=+21.990256069,LastTimestamp:2026-03-20 13:22:12.479662455 +0000 UTC m=+31.989381461,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.501796 4895 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5ceba53ec0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:22:12.483792576 +0000 UTC m=+31.993511612,LastTimestamp:2026-03-20 13:22:12.483792576 +0000 UTC m=+31.993511612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.623587 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f55e90d0b27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55e90d0b27 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.375492391 +0000 UTC m=+1.885211447,LastTimestamp:2026-03-20 13:22:12.616459396 +0000 UTC m=+32.126178402,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.869482 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f55ff06567f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55ff06567f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.744151679 +0000 UTC m=+2.253870675,LastTimestamp:2026-03-20 13:22:12.863942694 +0000 UTC m=+32.373661670,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:12 crc kubenswrapper[4895]: E0320 13:22:12.882015 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f55ffd5fe1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f55ffd5fe1e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:42.757760542 +0000 UTC m=+2.267479548,LastTimestamp:2026-03-20 13:22:12.879258078 +0000 UTC m=+32.388977054,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.144483 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.405105 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.405536 4895 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f6947eccf2e6472e6aadd8de5a75013a05892dd37296d7c39c4953ee9d228fdb" exitCode=255 Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.405576 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f6947eccf2e6472e6aadd8de5a75013a05892dd37296d7c39c4953ee9d228fdb"} Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.405601 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"61495ce596eb916ab8bfc2e9b9637df01765b5bd7020b3d0807842a1227dc334"} Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.405689 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.406898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.406953 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:13 crc kubenswrapper[4895]: I0320 13:22:13.406974 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.144007 
4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.211382 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.213038 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.213091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.213112 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:14 crc kubenswrapper[4895]: I0320 13:22:14.214029 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.142072 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.414916 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.415705 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.418194 4895 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" exitCode=255 Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.418268 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa"} Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.418340 4895 scope.go:117] "RemoveContainer" containerID="a7cc7dfd5d1dee853e87a3b6978e1f5fd53050f80a08f213767817b4c181d468" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.418551 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.419858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.419913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.419931 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.420780 4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:15 crc kubenswrapper[4895]: E0320 13:22:15.421093 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:15 crc 
kubenswrapper[4895]: I0320 13:22:15.572755 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.572989 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.574805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.574868 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.574886 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.974887 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.976799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.976871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.976888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:15 crc kubenswrapper[4895]: I0320 13:22:15.976929 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:15 crc kubenswrapper[4895]: E0320 13:22:15.984722 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:15 crc kubenswrapper[4895]: E0320 
13:22:15.985140 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:16 crc kubenswrapper[4895]: I0320 13:22:16.143484 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:16 crc kubenswrapper[4895]: I0320 13:22:16.424419 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.144557 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.807597 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.808000 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.809943 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.810020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.810039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
20 13:22:17 crc kubenswrapper[4895]: I0320 13:22:17.810967 4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:17 crc kubenswrapper[4895]: E0320 13:22:17.811259 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:18 crc kubenswrapper[4895]: I0320 13:22:18.143531 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:19 crc kubenswrapper[4895]: I0320 13:22:19.141847 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:19 crc kubenswrapper[4895]: I0320 13:22:19.479224 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:19 crc kubenswrapper[4895]: I0320 13:22:19.479505 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:19 crc kubenswrapper[4895]: I0320 13:22:19.481277 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:19 crc kubenswrapper[4895]: I0320 13:22:19.481324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:19 crc 
kubenswrapper[4895]: I0320 13:22:19.481340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:20 crc kubenswrapper[4895]: I0320 13:22:20.144776 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:21 crc kubenswrapper[4895]: I0320 13:22:21.146359 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:21 crc kubenswrapper[4895]: E0320 13:22:21.305493 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.143731 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.480318 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.480460 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:22:22 crc kubenswrapper[4895]: E0320 13:22:22.482795 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f5a97667687\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:22:22 crc kubenswrapper[4895]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a97667687 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:22 crc kubenswrapper[4895]: body: Mar 20 13:22:22 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480457351 +0000 UTC m=+21.990176317,LastTimestamp:2026-03-20 13:22:22.480430277 +0000 UTC m=+41.990149253,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:22 crc kubenswrapper[4895]: > Mar 20 13:22:22 crc kubenswrapper[4895]: E0320 13:22:22.488811 4895 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f5a9767ae0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5a9767ae0f 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:02.480537103 +0000 UTC m=+21.990256069,LastTimestamp:2026-03-20 13:22:22.480503769 +0000 UTC m=+41.990222745,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.985513 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.987216 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.987282 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.987307 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:22 crc kubenswrapper[4895]: I0320 13:22:22.987352 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:22 crc kubenswrapper[4895]: E0320 13:22:22.996953 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:22 crc kubenswrapper[4895]: E0320 
13:22:22.997018 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.144064 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.177518 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.177762 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.179216 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.179283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.179302 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:23 crc kubenswrapper[4895]: I0320 13:22:23.180272 4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:23 crc kubenswrapper[4895]: E0320 13:22:23.180589 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:24 crc kubenswrapper[4895]: I0320 13:22:24.144781 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:24 crc kubenswrapper[4895]: W0320 13:22:24.607068 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:24 crc kubenswrapper[4895]: E0320 13:22:24.607155 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:25 crc kubenswrapper[4895]: I0320 13:22:25.142793 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:26 crc kubenswrapper[4895]: I0320 13:22:26.141375 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:27 crc kubenswrapper[4895]: I0320 13:22:27.141021 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:27 crc kubenswrapper[4895]: W0320 13:22:27.844301 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:22:27 crc kubenswrapper[4895]: E0320 13:22:27.844438 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:28 crc kubenswrapper[4895]: I0320 13:22:28.142721 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.143009 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.483162 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.483329 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.484347 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.484411 4895 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.484426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.486555 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.618085 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.619697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.619746 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.619759 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.998085 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.999181 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.999238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.999252 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4895]: I0320 13:22:29.999279 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:30 crc kubenswrapper[4895]: E0320 13:22:30.003577 4895 
controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:30 crc kubenswrapper[4895]: E0320 13:22:30.003659 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:30 crc kubenswrapper[4895]: I0320 13:22:30.140885 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:30 crc kubenswrapper[4895]: W0320 13:22:30.291749 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:22:30 crc kubenswrapper[4895]: E0320 13:22:30.291804 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:31 crc kubenswrapper[4895]: I0320 13:22:31.143719 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:31 crc kubenswrapper[4895]: E0320 13:22:31.305645 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:32 crc 
kubenswrapper[4895]: I0320 13:22:32.141034 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.143037 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.381379 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.381719 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.383577 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.383629 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:33 crc kubenswrapper[4895]: I0320 13:22:33.383648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:33 crc kubenswrapper[4895]: W0320 13:22:33.784629 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:22:33 crc kubenswrapper[4895]: E0320 13:22:33.784688 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.143790 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.210794 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.212231 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.212288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.212307 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:34 crc kubenswrapper[4895]: I0320 13:22:34.213125 4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:34 crc kubenswrapper[4895]: E0320 13:22:34.213435 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:35 crc kubenswrapper[4895]: I0320 13:22:35.141705 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:36 crc kubenswrapper[4895]: I0320 13:22:36.141476 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.004005 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.006004 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.006057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.006071 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.006104 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:37 crc kubenswrapper[4895]: E0320 13:22:37.013212 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:37 crc kubenswrapper[4895]: E0320 13:22:37.013252 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:37 crc kubenswrapper[4895]: I0320 13:22:37.144287 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:38 crc kubenswrapper[4895]: I0320 13:22:38.143486 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:39 crc kubenswrapper[4895]: I0320 13:22:39.144727 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:40 crc kubenswrapper[4895]: I0320 13:22:40.144081 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:41 crc kubenswrapper[4895]: I0320 13:22:41.143996 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:41 crc kubenswrapper[4895]: E0320 13:22:41.306213 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4895]: I0320 13:22:42.141276 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:43 crc kubenswrapper[4895]: I0320 13:22:43.144737 4895 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.014460 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.016536 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.016613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.016648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.016701 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:44 crc kubenswrapper[4895]: E0320 13:22:44.023106 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:44 crc kubenswrapper[4895]: E0320 13:22:44.023117 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:44 crc kubenswrapper[4895]: I0320 13:22:44.142945 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:45 crc kubenswrapper[4895]: I0320 13:22:45.142437 4895 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:45 crc kubenswrapper[4895]: I0320 13:22:45.697205 4895 csr.go:261] certificate signing request csr-6vpqj is approved, waiting to be issued Mar 20 13:22:45 crc kubenswrapper[4895]: I0320 13:22:45.705979 4895 csr.go:257] certificate signing request csr-6vpqj is issued Mar 20 13:22:45 crc kubenswrapper[4895]: I0320 13:22:45.781695 4895 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 13:22:45 crc kubenswrapper[4895]: I0320 13:22:45.976149 4895 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 13:22:46 crc kubenswrapper[4895]: I0320 13:22:46.707891 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 22:19:32.20504445 +0000 UTC Mar 20 13:22:46 crc kubenswrapper[4895]: I0320 13:22:46.707926 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6320h56m45.497121161s for next certificate rotation Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.211210 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.212485 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.212517 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.212526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.213100 
4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.665489 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.667001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b"} Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.667150 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.667897 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.667921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.667930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4895]: I0320 13:22:47.807136 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.671295 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.672235 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.674475 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" exitCode=255 Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.674537 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b"} Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.674594 4895 scope.go:117] "RemoveContainer" containerID="0ace5ce5fd5dfaf0783b8bb7617c8ec08946c304cadec66f9c73a9127630cbfa" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.674872 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.676502 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.676544 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.676564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4895]: I0320 13:22:48.677660 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:22:48 crc kubenswrapper[4895]: E0320 13:22:48.677924 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.679950 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.683264 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.684898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.684969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.684995 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4895]: I0320 13:22:49.686202 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:22:49 crc kubenswrapper[4895]: E0320 13:22:49.686561 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.023878 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 
13:22:51.025758 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.025835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.025858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.026000 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.038853 4895 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.038939 4895 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.038959 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.043013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.043082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.043110 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.043140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.043163 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.065278 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.077200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.077262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.077279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.077304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.077322 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.097471 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.107529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.107562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.107572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.107587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.107598 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.123447 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.133611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.133645 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.133656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.133673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4895]: I0320 13:22:51.133685 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.145474 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.145705 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.145753 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.245946 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.306475 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.346703 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.447668 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.547769 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.648099 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.749035 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: E0320 13:22:51.850337 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:51 crc kubenswrapper[4895]: 
E0320 13:22:51.951064 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.052055 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.152191 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.253379 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.354381 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.454635 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.555859 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.656685 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.757329 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.859245 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:52 crc kubenswrapper[4895]: E0320 13:22:52.960356 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.061242 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.161556 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.176764 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.176945 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.177987 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.178011 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.178020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4895]: I0320 13:22:53.178465 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.178607 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.262616 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.363541 4895 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.464500 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.565352 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.665511 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.765778 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.866287 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:53 crc kubenswrapper[4895]: E0320 13:22:53.967161 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.067890 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.169851 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.270139 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.371169 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.471705 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.572372 4895 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.672947 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.773876 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.874845 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:54 crc kubenswrapper[4895]: E0320 13:22:54.975476 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.075590 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.176694 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.277420 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.378364 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.479156 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.580020 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.680725 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 
13:22:55.781501 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.882616 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:55 crc kubenswrapper[4895]: E0320 13:22:55.983777 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.084077 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: I0320 13:22:56.120602 4895 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.185195 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.286231 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.386947 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.487037 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.587889 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.688042 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.788506 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.889246 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:56 crc kubenswrapper[4895]: E0320 13:22:56.990175 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.091539 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.192245 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.293186 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.394264 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.495424 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.596180 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.696311 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.797638 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:57 crc kubenswrapper[4895]: E0320 13:22:57.898775 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:57.999940 4895 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.100829 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.201846 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.302825 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.404043 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.504610 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.604960 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.705813 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.806685 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:58 crc kubenswrapper[4895]: E0320 13:22:58.907330 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.008441 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.109479 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.209858 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.310914 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.411534 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.512267 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.612730 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.713463 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.814326 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:22:59 crc kubenswrapper[4895]: E0320 13:22:59.915356 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.016200 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.117267 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: I0320 13:23:00.211563 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:00 crc kubenswrapper[4895]: I0320 13:23:00.212885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:00 crc kubenswrapper[4895]: I0320 13:23:00.213039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:00 crc kubenswrapper[4895]: I0320 13:23:00.213064 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.217841 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.318171 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.419130 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.520114 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.621102 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.722195 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.823278 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:00 crc kubenswrapper[4895]: E0320 13:23:00.924008 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.025186 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.126216 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.226734 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.306651 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.327847 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.429057 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.530126 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.538545 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.545576 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.545668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.545698 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.545731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.545753 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.563616 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.569721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.569803 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.569822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.569854 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.569879 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.585713 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.591067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.591104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.591113 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.591128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.591143 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.604777 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.617659 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.617730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.617751 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.617779 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4895]: I0320 13:23:01.617799 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.636299 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cedb54ff-0ea2-432e-bafc-4f3a8bf58c53\\\",\\\"systemUUID\\\":\\\"2d141aca-ef91-4eca-959b-e9b486ead362\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.636585 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.636629 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.737157 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.838307 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:01 crc kubenswrapper[4895]: E0320 13:23:01.938716 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.039134 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.140068 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.240304 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.340745 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.441455 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.542517 4895 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.642937 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.743878 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.845016 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:02 crc kubenswrapper[4895]: E0320 13:23:02.945799 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.046178 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.146825 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.247171 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.347853 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.448692 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.549655 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.650309 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc 
kubenswrapper[4895]: E0320 13:23:03.751087 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.851284 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:03 crc kubenswrapper[4895]: E0320 13:23:03.952331 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.052817 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.153556 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.254149 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.355100 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.455720 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.556816 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.657569 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.758369 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.858555 4895 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:23:04 crc kubenswrapper[4895]: E0320 13:23:04.959729 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.060105 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.160663 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: I0320 13:23:05.211683 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4895]: I0320 13:23:05.213188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4895]: I0320 13:23:05.213257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4895]: I0320 13:23:05.213275 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.261657 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.362804 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.463364 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.563993 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.665099 4895 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.765598 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.865865 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: E0320 13:23:05.967011 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4895]: I0320 13:23:05.984653 4895 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.067332 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.167861 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: I0320 13:23:06.211774 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4895]: I0320 13:23:06.213900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4895]: I0320 13:23:06.213980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4895]: I0320 13:23:06.213992 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4895]: I0320 13:23:06.215048 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 
13:23:06.215281 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.268025 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.368231 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.469083 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.570220 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.671136 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.772065 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.872953 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:06 crc kubenswrapper[4895]: E0320 13:23:06.974082 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:07 crc kubenswrapper[4895]: E0320 13:23:07.074513 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:07 crc kubenswrapper[4895]: E0320 
13:23:07.175343 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:07 crc kubenswrapper[4895]: E0320 13:23:07.275730 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.347490 4895 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.378781 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.378862 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.378907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.378941 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.378965 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.482913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.482996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.483012 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.483039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.483057 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.588680 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.588817 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.588835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.588871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.588888 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.692130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.692183 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.692206 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.692237 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.692260 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.795446 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.795562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.795579 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.795602 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.795617 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.899339 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.899452 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.899510 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.899544 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4895]: I0320 13:23:07.899568 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.003499 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.003598 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.003614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.003644 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.003661 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.106735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.106789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.106814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.106839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.106856 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.161867 4895 apiserver.go:52] "Watching apiserver" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.168189 4895 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.168646 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.169109 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.169134 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.169197 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.169968 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.170167 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.170753 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.172445 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.172543 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.172639 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.172725 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.172805 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.173088 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.174242 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.174601 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.174806 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.174857 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.175050 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.175208 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.209673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.209805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 
13:23:08.209829 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.209858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.209880 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.212759 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.234230 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.247002 4895 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.249836 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.265688 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.283749 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.297077 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313229 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313258 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313614 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.313840 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.328153 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.331224 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.331509 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.331685 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" 
(UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.331824 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.331864 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.332200 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.332342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.332490 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.332818 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.332929 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333066 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333108 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333445 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333586 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333769 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333936 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.333978 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334014 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334124 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334157 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334192 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334261 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334354 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334416 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334449 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334481 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334517 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334557 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334589 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334621 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334690 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334727 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334760 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334764 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334829 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334861 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334927 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334959 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.334990 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335123 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335159 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335191 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:23:08 crc 
kubenswrapper[4895]: I0320 13:23:08.335227 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335303 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335336 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335370 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335433 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335513 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335551 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335586 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335617 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335650 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335686 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335429 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335720 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335443 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335758 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335795 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335865 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335898 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335935 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335971 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336008 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336042 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336078 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336113 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336149 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336182 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336218 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336290 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336444 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336481 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336525 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336559 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336592 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336628 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336668 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336702 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336736 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:23:08 crc 
kubenswrapper[4895]: I0320 13:23:08.336772 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336809 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336844 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336879 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336914 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336948 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") 
pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336984 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337157 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337191 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337230 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 
13:23:08.337266 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337305 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337340 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337375 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337448 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337552 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337590 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337662 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337698 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 
13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337735 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337771 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337806 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337878 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337913 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337956 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337998 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338034 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338070 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338106 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 
20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338141 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338178 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338288 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338325 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338363 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338468 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338505 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338542 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 
13:23:08.338582 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338623 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338661 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338698 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338734 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338779 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338815 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338854 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338890 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338926 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338966 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339014 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339054 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339094 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339132 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339170 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339207 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339285 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339321 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339457 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339498 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339536 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339607 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339648 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339687 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 
13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339728 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339764 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339801 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339837 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339874 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339914 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339952 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339991 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340029 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340066 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340105 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 
13:23:08.340145 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340184 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340223 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340304 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340378 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340457 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340501 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340551 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340595 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340637 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340675 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340754 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340825 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340883 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340944 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340994 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341034 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341075 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341116 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341156 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341268 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341484 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341533 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341573 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341612 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341686 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341732 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341955 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335666 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348538 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335856 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335953 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.335987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.336228 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337081 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337988 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.337981 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338148 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338249 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338287 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338467 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338554 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338740 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.338924 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339300 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339546 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339614 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339606 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339885 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.339893 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340525 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340558 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.340681 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341154 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341199 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341289 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349655 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341352 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.341584 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.342091 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:08.842051337 +0000 UTC m=+88.351770353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") 
pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350083 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350197 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350224 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350248 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350271 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350293 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350314 4895 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350337 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350359 4895 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350382 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350458 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350480 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349802 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.349948 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.351033 4895 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.351160 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:08.851134215 +0000 UTC m=+88.360853221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.351541 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352012 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352052 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352377 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352856 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352823 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352975 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.352987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342348 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342561 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342705 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342777 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342940 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343116 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343134 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343322 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343720 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343730 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343772 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343910 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343930 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.343926 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.344439 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.344813 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.345231 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.344989 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.345276 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.345473 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.345594 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.345900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.353974 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.346170 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.346199 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.346444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.346577 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.346983 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347065 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347334 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347348 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347492 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347703 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347804 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.347815 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348116 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348221 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348268 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.348305 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349131 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349550 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349589 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.349623 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350092 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.350211 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350314 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.350764 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.342193 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.353656 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.353811 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.354315 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.353477 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.354535 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.354778 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.355339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.355819 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:08.855736174 +0000 UTC m=+88.365455180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.359545 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.359787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.360025 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.360157 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.359854 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.360854 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.361171 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.361239 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.361434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.361459 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.366211 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.370338 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.371029 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.371746 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.372100 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.373720 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.373839 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.373923 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.374304 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.374429 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.374504 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.374636 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:08.874615754 +0000 UTC m=+88.384334730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.374783 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:08.874771147 +0000 UTC m=+88.384490123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.376639 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377233 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377257 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377267 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377291 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377558 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377581 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377745 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.377756 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.378012 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.378363 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.378534 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.378885 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379215 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379269 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379307 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379652 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379660 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379743 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.379827 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380223 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380656 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380748 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.380935 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.381007 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.381843 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.381951 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.384940 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.385981 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.386589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.388984 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.389140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.389292 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.391468 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.392829 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.394482 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.394502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.394697 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.394702 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.395518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.395527 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.395599 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.395968 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.396016 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.396053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.395381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.396971 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.397416 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.397522 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.397766 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.397949 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.398030 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.398145 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.399434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.399537 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.399738 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.400115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.400140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.400342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.400526 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.401146 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.407535 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.409017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.415476 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.417601 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.417647 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.417663 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.417686 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.417700 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.421513 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.438608 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451509 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451549 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451593 4895 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451607 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451617 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451626 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451617 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451634 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451697 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451711 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: 
I0320 13:23:08.451711 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451726 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451744 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451768 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451785 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451803 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451815 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451829 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451840 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451851 4895 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451862 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451873 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451885 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451896 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451908 4895 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451919 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451930 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451942 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451954 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451968 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451979 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.451990 4895 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452003 4895 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452016 4895 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452027 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452044 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452055 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452066 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452079 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452090 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452102 4895 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452113 4895 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452123 4895 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452135 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452145 4895 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452157 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452168 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452179 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452189 4895 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452200 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452212 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452222 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452233 4895 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc 
kubenswrapper[4895]: I0320 13:23:08.452243 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452255 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452266 4895 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452277 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452288 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452301 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452313 4895 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452323 4895 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452334 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452346 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452358 4895 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452370 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452381 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452426 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452438 4895 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452451 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452466 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452479 4895 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452490 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452503 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452535 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452547 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452559 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452571 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452583 4895 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452595 4895 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452606 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452617 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452655 4895 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452669 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452681 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452693 4895 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452706 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452718 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452731 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452743 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 
13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452755 4895 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452768 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452780 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452793 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452805 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452818 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452829 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452841 4895 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452855 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452867 4895 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452879 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452891 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452902 4895 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452912 4895 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452924 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452934 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452945 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452963 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452973 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452984 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.452994 4895 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453005 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453017 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453029 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453040 4895 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453050 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453060 4895 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453071 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453083 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453094 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453106 4895 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453118 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453129 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453141 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453152 4895 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453163 4895 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc 
kubenswrapper[4895]: I0320 13:23:08.453174 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453184 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453194 4895 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453205 4895 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453216 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453227 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453238 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453249 4895 reconciler_common.go:293] "Volume detached 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453260 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453271 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453282 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453292 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453302 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453316 4895 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453326 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") 
on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453337 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453348 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453358 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453369 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453379 4895 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453407 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453418 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 
13:23:08.453428 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453440 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453450 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453460 4895 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453472 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453483 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453496 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453507 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453517 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453528 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453539 4895 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453553 4895 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453568 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453582 4895 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453597 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453610 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453624 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453636 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453649 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453660 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453671 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453682 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453692 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453706 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453716 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453727 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453737 4895 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453747 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453757 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453769 
4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453779 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453789 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453800 4895 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453811 4895 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453832 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453844 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453854 4895 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453865 4895 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453875 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.453886 4895 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.498511 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.514309 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.523765 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.523805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.523818 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.523835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.523846 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.528181 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:23:08 crc kubenswrapper[4895]: W0320 13:23:08.542849 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-eca9de501988317d48e05136ae6e09f3e523d73c591209bbb5d06498782592e8 WatchSource:0}: Error finding container eca9de501988317d48e05136ae6e09f3e523d73c591209bbb5d06498782592e8: Status 404 returned error can't find the container with id eca9de501988317d48e05136ae6e09f3e523d73c591209bbb5d06498782592e8 Mar 20 13:23:08 crc kubenswrapper[4895]: W0320 13:23:08.544661 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-66ea8260895234a0942f19755c78244e8dcf6ae295970a38828a98edd3c29ba5 WatchSource:0}: Error finding container 66ea8260895234a0942f19755c78244e8dcf6ae295970a38828a98edd3c29ba5: Status 404 returned error can't find the container with id 66ea8260895234a0942f19755c78244e8dcf6ae295970a38828a98edd3c29ba5 Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.625748 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.625797 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.625808 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.625823 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.625835 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.728382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.728461 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.728476 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.728494 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.728507 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.734230 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e41187377884884acab22b9615aafff540c7cda8ca2cff07277a5b760eb62522"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.734282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eca9de501988317d48e05136ae6e09f3e523d73c591209bbb5d06498782592e8"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.735616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"75f4921d5cbf9254dfc692f6c2c76c045b7c819099e4e4668433dce9efe8c37a"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.737083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"66ea8260895234a0942f19755c78244e8dcf6ae295970a38828a98edd3c29ba5"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.834030 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.834089 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.834102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.834128 4895 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.834141 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.856684 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.856795 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:09.856776598 +0000 UTC m=+89.366495564 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.856827 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.856869 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.856943 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.856962 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.856987 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:09.856979802 +0000 UTC m=+89.366698768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.857003 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:09.856997632 +0000 UTC m=+89.366716598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.936783 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.936832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.936844 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.936869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.936881 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.957517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:08 crc kubenswrapper[4895]: I0320 13:23:08.957582 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957748 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957767 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957782 4895 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957855 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:09.957837119 +0000 UTC m=+89.467556085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957884 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957939 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.957954 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:08 crc kubenswrapper[4895]: E0320 13:23:08.958074 4895 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:09.958046223 +0000 UTC m=+89.467765339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.039812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.040140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.040149 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.040165 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.040178 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.144271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.144317 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.144329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.144346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.144359 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.216828 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.217585 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.219505 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.220375 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.222056 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.222904 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.224058 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.225670 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.226943 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.228649 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.229605 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.232136 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.232899 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.233640 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.234938 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.235697 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.237076 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.238837 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.239845 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.241288 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.241990 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.243350 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.244039 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.245610 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.246262 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247446 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247481 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.247988 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.249762 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.250553 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.252363 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.253353 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.255034 4895 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.255215 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.258248 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.259753 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.260350 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.262997 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.263955 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.265275 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.266436 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.268338 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.268990 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.270086 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.270817 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.271922 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.272379 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.273450 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.273918 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.275234 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.275874 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.276829 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.277587 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.278435 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.279652 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.280275 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.349548 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.349597 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.349605 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.349619 4895 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.349628 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.452339 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.452470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.452490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.452547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.452565 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.556281 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.556320 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.556329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.556345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.556355 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.658830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.658898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.658920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.658946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.658963 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.746134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e442adb2e4aa0bd90f0c88fab2119ff563ed2a1e901ab5592d97f3bcbe940941"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.747739 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ac922b1c2b079dedeb9bffd73cbed05ba7d57e4c7b75541905b8074c183eece6"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.761710 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.761760 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.761769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.761786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.761798 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.763825 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.788595 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.803472 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e442adb2e4aa0bd90f0c88fab2119ff563ed2a1e901ab5592d97f3bcbe940941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41187377884884acab22b9615aafff540c7cda8ca2cff07277a5b760eb62522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.813873 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.825407 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.836663 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.847151 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.858223 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac922b1c2b079dedeb9bffd73cbed05ba7d57e4c7b75541905b8074c183eece6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.863313 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.863347 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.863356 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.863371 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.863379 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.865602 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.865663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.865702 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.865800 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:11.865776725 +0000 UTC m=+91.375495741 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.865829 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.865830 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.865878 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:11.865867417 +0000 UTC m=+91.375586383 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.865893 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:11.865886677 +0000 UTC m=+91.375605643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.869520 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e442adb2e4aa0bd90f0c88fab2119ff563ed2a1e901ab5592d97f3bcbe940941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/en
v\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41187377884884acab22b9615aafff540c7cda8ca2cff07277a5b760eb62522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.879771 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.890309 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.901896 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.965492 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.965527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.965535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.965548 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.965557 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.966289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:09 crc kubenswrapper[4895]: I0320 13:23:09.966540 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.966456 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.966967 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.967139 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.966593 4895 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.967339 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.967351 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.967648 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:11.967307285 +0000 UTC m=+91.477026291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:09 crc kubenswrapper[4895]: E0320 13:23:09.967822 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:11.967802726 +0000 UTC m=+91.477521722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.067375 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.067464 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.067484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.067510 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.067527 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.170121 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.170167 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.170179 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.170195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.170206 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.211455 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.211515 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.211599 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:10 crc kubenswrapper[4895]: E0320 13:23:10.211617 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:10 crc kubenswrapper[4895]: E0320 13:23:10.211758 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:10 crc kubenswrapper[4895]: E0320 13:23:10.211879 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.272103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.272142 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.272151 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.272165 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.272175 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.374526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.374596 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.374615 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.374643 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.374661 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.476869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.476908 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.476920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.476935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.476948 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.579160 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.579205 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.579217 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.579232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.579247 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.680889 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.680943 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.680959 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.680980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.680997 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.782957 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.782997 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.783009 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.783026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.783039 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.885553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.885616 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.885639 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.885671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.885695 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.988040 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.988088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.988104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.988123 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4895]: I0320 13:23:10.988140 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.091025 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.091088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.091147 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.091177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.091204 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.194104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.194199 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.194251 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.194274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.194362 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.296294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.296328 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.296338 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.296373 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.296383 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.398726 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.398760 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.398769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.398783 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.398791 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.501533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.501583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.501595 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.501612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.501624 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.603493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.603594 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.603612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.603638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.603656 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.706066 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.706128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.706145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.706169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.706190 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.750992 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.751029 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.751039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.751054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.751065 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.753774 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"87cd23a2ad1afb0cf18a0e4342f2d96a92db72e8950527a6da696350c091d32f"} Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.884660 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.884819 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.884871 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.885004 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.884971 4895 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:15.884934353 +0000 UTC m=+95.394653359 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.885058 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.885065 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:15.885048715 +0000 UTC m=+95.394767711 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.885168 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:15.885142497 +0000 UTC m=+95.394861493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.986067 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:11 crc kubenswrapper[4895]: I0320 13:23:11.986132 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986263 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986300 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986311 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 
20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986319 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986329 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986345 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986478 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:15.986379552 +0000 UTC m=+95.496098518 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:11 crc kubenswrapper[4895]: E0320 13:23:11.986512 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:15.986503205 +0000 UTC m=+95.496222171 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:12 crc kubenswrapper[4895]: I0320 13:23:12.210974 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:12 crc kubenswrapper[4895]: E0320 13:23:12.211096 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:12 crc kubenswrapper[4895]: I0320 13:23:12.211319 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:12 crc kubenswrapper[4895]: I0320 13:23:12.211409 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:12 crc kubenswrapper[4895]: E0320 13:23:12.211461 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:12 crc kubenswrapper[4895]: E0320 13:23:12.211511 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:12 crc kubenswrapper[4895]: I0320 13:23:12.622047 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 13:23:12 crc kubenswrapper[4895]: I0320 13:23:12.631127 4895 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:23:14 crc kubenswrapper[4895]: I0320 13:23:14.211114 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:14 crc kubenswrapper[4895]: I0320 13:23:14.211283 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:14 crc kubenswrapper[4895]: I0320 13:23:14.211471 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:14 crc kubenswrapper[4895]: E0320 13:23:14.211458 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:14 crc kubenswrapper[4895]: E0320 13:23:14.211690 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:14 crc kubenswrapper[4895]: E0320 13:23:14.211770 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:15 crc kubenswrapper[4895]: I0320 13:23:15.923108 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:15 crc kubenswrapper[4895]: E0320 13:23:15.923328 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:23.92328866 +0000 UTC m=+103.433007666 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:15 crc kubenswrapper[4895]: I0320 13:23:15.923489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:15 crc kubenswrapper[4895]: I0320 13:23:15.923576 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:15 crc kubenswrapper[4895]: E0320 13:23:15.923711 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:15 crc kubenswrapper[4895]: E0320 13:23:15.923713 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:15 crc kubenswrapper[4895]: E0320 13:23:15.923798 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:23.923775981 +0000 UTC m=+103.433494977 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:15 crc kubenswrapper[4895]: E0320 13:23:15.923824 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:23.923811871 +0000 UTC m=+103.433530877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:16 crc kubenswrapper[4895]: I0320 13:23:16.024938 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:16 crc kubenswrapper[4895]: I0320 13:23:16.025014 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025184 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025208 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025222 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025229 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025272 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025293 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025295 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:24.025270772 +0000 UTC m=+103.534989748 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.025446 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:24.025370424 +0000 UTC m=+103.535089430 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:16 crc kubenswrapper[4895]: I0320 13:23:16.211176 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:16 crc kubenswrapper[4895]: I0320 13:23:16.211233 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:16 crc kubenswrapper[4895]: I0320 13:23:16.211185 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.211459 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.211597 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:16 crc kubenswrapper[4895]: E0320 13:23:16.211777 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:18 crc kubenswrapper[4895]: I0320 13:23:18.211294 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:18 crc kubenswrapper[4895]: I0320 13:23:18.211462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:18 crc kubenswrapper[4895]: E0320 13:23:18.211582 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:18 crc kubenswrapper[4895]: I0320 13:23:18.211634 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:18 crc kubenswrapper[4895]: E0320 13:23:18.211850 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:18 crc kubenswrapper[4895]: E0320 13:23:18.211965 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:20 crc kubenswrapper[4895]: I0320 13:23:20.211615 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:20 crc kubenswrapper[4895]: I0320 13:23:20.211687 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:20 crc kubenswrapper[4895]: E0320 13:23:20.211807 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:20 crc kubenswrapper[4895]: I0320 13:23:20.211712 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:20 crc kubenswrapper[4895]: E0320 13:23:20.211922 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:20 crc kubenswrapper[4895]: E0320 13:23:20.212065 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.224835 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.225472 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:23:21 crc kubenswrapper[4895]: E0320 13:23:21.225642 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.782692 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:23:21 crc kubenswrapper[4895]: E0320 13:23:21.783023 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.931932 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dv5sj"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.932262 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dv5sj" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.936251 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.937635 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.938317 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.947908 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w9jrr"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.948641 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.949540 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-krtrm"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.950427 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.952707 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.952844 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.952986 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.952862 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.958054 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.958849 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5xtn2"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.959273 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5xtn2" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.959659 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.959798 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.959889 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.960155 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.960054 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.961309 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.964460 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.972098 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v6kxx"] Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.978158 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983297 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983308 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983723 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0691d92-79a3-4d54-ae28-08da184e162f-hosts-file\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983896 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xsvl\" (UniqueName: \"kubernetes.io/projected/b0691d92-79a3-4d54-ae28-08da184e162f-kube-api-access-8xsvl\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.983952 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.984110 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:23:21 crc kubenswrapper[4895]: I0320 13:23:21.984169 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:23:21 crc 
kubenswrapper[4895]: I0320 13:23:21.985204 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.058961 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"] Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.059379 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.061207 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.061711 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.062024 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.063182 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-binary-copy\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085339 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-rootfs\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085384 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-proxy-tls\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085429 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-hostroot\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085547 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085607 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085647 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085689 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-daemon-config\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-multus-certs\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085756 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085788 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085824 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085891 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx9hm\" (UniqueName: \"kubernetes.io/projected/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-kube-api-access-fx9hm\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.085993 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cnibin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086025 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-k8s-cni-cncf-io\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086143 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-kubelet\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " 
pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086180 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-system-cni-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086217 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-os-release\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086249 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn6xf\" (UniqueName: \"kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-system-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086311 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4q5j\" (UniqueName: \"kubernetes.io/projected/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-kube-api-access-j4q5j\") pod \"multus-5xtn2\" 
(UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086342 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086439 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xsvl\" (UniqueName: \"kubernetes.io/projected/b0691d92-79a3-4d54-ae28-08da184e162f-kube-api-access-8xsvl\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086497 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-socket-dir-parent\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086532 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-conf-dir\") pod \"multus-5xtn2\" (UID: 
\"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086601 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-etc-kubernetes\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086635 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-multus\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086669 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086699 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-os-release\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086729 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-netns\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086760 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086796 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086828 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086862 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cni-binary-copy\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086906 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.086972 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0691d92-79a3-4d54-ae28-08da184e162f-hosts-file\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fktrs\" (UniqueName: \"kubernetes.io/projected/a332b86a-c2f9-4702-9829-ba837dc7c404-kube-api-access-fktrs\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087045 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087082 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087108 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-bin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087151 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0691d92-79a3-4d54-ae28-08da184e162f-hosts-file\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087154 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-cnibin\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087251 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.087306 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.113338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xsvl\" (UniqueName: \"kubernetes.io/projected/b0691d92-79a3-4d54-ae28-08da184e162f-kube-api-access-8xsvl\") pod \"node-resolver-dv5sj\" (UID: \"b0691d92-79a3-4d54-ae28-08da184e162f\") " pod="openshift-dns/node-resolver-dv5sj"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.160569 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8rkgs"]
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.160943 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8rkgs"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.163046 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.163632 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.163804 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.165852 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188104 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188186 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-daemon-config\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-multus-certs\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188235 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188254 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188278 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx9hm\" (UniqueName: \"kubernetes.io/projected/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-kube-api-access-fx9hm\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188332 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188355 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cnibin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188382 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188475 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-k8s-cni-cncf-io\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188486 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cnibin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188417 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-k8s-cni-cncf-io\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188522 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-multus-certs\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188558 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188571 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188620 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-kubelet\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188593 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-kubelet\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188634 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188684 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-system-cni-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188578 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-system-cni-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188763 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-os-release\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn6xf\" (UniqueName: \"kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-system-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188866 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4q5j\" (UniqueName: \"kubernetes.io/projected/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-kube-api-access-j4q5j\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188952 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-daemon-config\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188955 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-system-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.188965 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-socket-dir-parent\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189030 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-socket-dir-parent\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189048 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-conf-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-etc-kubernetes\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189139 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-os-release\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189149 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-conf-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189163 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-multus\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-etc-kubernetes\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189194 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-multus\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189240 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7781484c-a6e3-48e2-b7ce-975d701624c8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189277 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7781484c-a6e3-48e2-b7ce-975d701624c8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-os-release\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-netns\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189416 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189442 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-run-netns\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189453 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-os-release\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189456 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189490 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189492 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189506 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cni-binary-copy\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189594 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fktrs\" (UniqueName: \"kubernetes.io/projected/a332b86a-c2f9-4702-9829-ba837dc7c404-kube-api-access-fktrs\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189641 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189667 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189708 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-bin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189759 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-cnibin\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189806 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7781484c-a6e3-48e2-b7ce-975d701624c8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189831 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-binary-copy\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189897 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-rootfs\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189924 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-proxy-tls\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189954 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.189984 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-hostroot\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190016 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190055 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190107 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190166 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-rootfs\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190279 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-multus-cni-dir\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2"
Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-cni-binary-copy\") pod \"multus-5xtn2\" (UID:
\"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190624 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-hostroot\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190809 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190894 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.190962 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc 
kubenswrapper[4895]: I0320 13:23:22.191014 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-host-var-lib-cni-bin\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.191058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a332b86a-c2f9-4702-9829-ba837dc7c404-cnibin\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.191163 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-binary-copy\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.191577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-mcd-auth-proxy-config\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.191841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.191962 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a332b86a-c2f9-4702-9829-ba837dc7c404-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.195852 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.195944 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-proxy-tls\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.209163 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4q5j\" (UniqueName: \"kubernetes.io/projected/cab32ac6-a22f-4e11-9eaf-4c50ffbce748-kube-api-access-j4q5j\") pod \"multus-5xtn2\" (UID: \"cab32ac6-a22f-4e11-9eaf-4c50ffbce748\") " pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.211528 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.211689 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.211951 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx9hm\" (UniqueName: \"kubernetes.io/projected/6e9e3134-0fea-4e77-a1e4-e74835ee41e8-kube-api-access-fx9hm\") pod \"machine-config-daemon-w9jrr\" (UID: \"6e9e3134-0fea-4e77-a1e4-e74835ee41e8\") " pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.212106 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.212163 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.212078 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.212376 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.215117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fktrs\" (UniqueName: \"kubernetes.io/projected/a332b86a-c2f9-4702-9829-ba837dc7c404-kube-api-access-fktrs\") pod \"multus-additional-cni-plugins-krtrm\" (UID: \"a332b86a-c2f9-4702-9829-ba837dc7c404\") " pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.216687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn6xf\" (UniqueName: \"kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf\") pod \"ovnkube-node-v6kxx\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.262719 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dv5sj" Mar 20 13:23:22 crc kubenswrapper[4895]: W0320 13:23:22.279523 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0691d92_79a3_4d54_ae28_08da184e162f.slice/crio-c1ba6be909e5b1029212b0e75b5b8c3d3e03b09b7d7c1b44362ba8dbdc698ab6 WatchSource:0}: Error finding container c1ba6be909e5b1029212b0e75b5b8c3d3e03b09b7d7c1b44362ba8dbdc698ab6: Status 404 returned error can't find the container with id c1ba6be909e5b1029212b0e75b5b8c3d3e03b09b7d7c1b44362ba8dbdc698ab6 Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292434 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7781484c-a6e3-48e2-b7ce-975d701624c8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292508 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7ec79b-b314-472a-942f-6bf0fecd3318-host\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292633 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292641 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dtnk\" (UniqueName: \"kubernetes.io/projected/6f7ec79b-b314-472a-942f-6bf0fecd3318-kube-api-access-9dtnk\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7781484c-a6e3-48e2-b7ce-975d701624c8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7781484c-a6e3-48e2-b7ce-975d701624c8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292855 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f7ec79b-b314-472a-942f-6bf0fecd3318-serviceca\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 
13:23:22.292883 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.292958 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7781484c-a6e3-48e2-b7ce-975d701624c8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.293900 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7781484c-a6e3-48e2-b7ce-975d701624c8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.296057 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.297296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7781484c-a6e3-48e2-b7ce-975d701624c8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.311350 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7781484c-a6e3-48e2-b7ce-975d701624c8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7dj6\" (UID: \"7781484c-a6e3-48e2-b7ce-975d701624c8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.313362 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krtrm" Mar 20 13:23:22 crc kubenswrapper[4895]: W0320 13:23:22.314659 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9e3134_0fea_4e77_a1e4_e74835ee41e8.slice/crio-ac826f5917b70cc48fb40e4ad2c7558e1f61ed53986c1d2caa9ef2ffb13eb7f2 WatchSource:0}: Error finding container ac826f5917b70cc48fb40e4ad2c7558e1f61ed53986c1d2caa9ef2ffb13eb7f2: Status 404 returned error can't find the container with id ac826f5917b70cc48fb40e4ad2c7558e1f61ed53986c1d2caa9ef2ffb13eb7f2 Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.328520 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5xtn2" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.337667 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.352927 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt"] Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.353295 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.356415 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.357161 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.372864 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.373202 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-t9xh5"] Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.373701 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.373783 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.393438 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f7ec79b-b314-472a-942f-6bf0fecd3318-serviceca\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.393512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7ec79b-b314-472a-942f-6bf0fecd3318-host\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.393551 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dtnk\" (UniqueName: \"kubernetes.io/projected/6f7ec79b-b314-472a-942f-6bf0fecd3318-kube-api-access-9dtnk\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.393587 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7ec79b-b314-472a-942f-6bf0fecd3318-host\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.394482 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6f7ec79b-b314-472a-942f-6bf0fecd3318-serviceca\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.412105 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dtnk\" (UniqueName: \"kubernetes.io/projected/6f7ec79b-b314-472a-942f-6bf0fecd3318-kube-api-access-9dtnk\") pod \"node-ca-8rkgs\" (UID: \"6f7ec79b-b314-472a-942f-6bf0fecd3318\") " pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.477894 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8rkgs" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494181 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhpl\" (UniqueName: \"kubernetes.io/projected/e899877b-fe80-4ace-9b35-41eb7302cf12-kube-api-access-5zhpl\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn8xp\" (UniqueName: \"kubernetes.io/projected/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-kube-api-access-kn8xp\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494321 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.494609 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: W0320 13:23:22.496105 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f7ec79b_b314_472a_942f_6bf0fecd3318.slice/crio-464c459b1a76680af0a661c12bae0717c371b347b76b4ab5e69e6f9bbce9fcb1 WatchSource:0}: Error finding container 464c459b1a76680af0a661c12bae0717c371b347b76b4ab5e69e6f9bbce9fcb1: Status 404 returned error can't find the container with id 464c459b1a76680af0a661c12bae0717c371b347b76b4ab5e69e6f9bbce9fcb1 Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.596816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.596883 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhpl\" (UniqueName: \"kubernetes.io/projected/e899877b-fe80-4ace-9b35-41eb7302cf12-kube-api-access-5zhpl\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.596929 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn8xp\" (UniqueName: \"kubernetes.io/projected/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-kube-api-access-kn8xp\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.596964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.596989 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.597010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.597666 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:22 crc kubenswrapper[4895]: E0320 13:23:22.597843 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs podName:e899877b-fe80-4ace-9b35-41eb7302cf12 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:23.097823125 +0000 UTC m=+102.607542091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs") pod "network-metrics-daemon-t9xh5" (UID: "e899877b-fe80-4ace-9b35-41eb7302cf12") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.598494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.598579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc 
kubenswrapper[4895]: I0320 13:23:22.609994 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.615064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhpl\" (UniqueName: \"kubernetes.io/projected/e899877b-fe80-4ace-9b35-41eb7302cf12-kube-api-access-5zhpl\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.616096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn8xp\" (UniqueName: \"kubernetes.io/projected/d4e23d99-76bd-4135-b5f3-0bb176a9ca29-kube-api-access-kn8xp\") pod \"ovnkube-control-plane-749d76644c-9wqgt\" (UID: \"d4e23d99-76bd-4135-b5f3-0bb176a9ca29\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.700979 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" Mar 20 13:23:22 crc kubenswrapper[4895]: W0320 13:23:22.714742 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e23d99_76bd_4135_b5f3_0bb176a9ca29.slice/crio-21d33bec827e581f6eb3d97cae37bd42672f7166a238a58c15e072726d27e585 WatchSource:0}: Error finding container 21d33bec827e581f6eb3d97cae37bd42672f7166a238a58c15e072726d27e585: Status 404 returned error can't find the container with id 21d33bec827e581f6eb3d97cae37bd42672f7166a238a58c15e072726d27e585 Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.787438 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8rkgs" event={"ID":"6f7ec79b-b314-472a-942f-6bf0fecd3318","Type":"ContainerStarted","Data":"64a0a74e7167a19ab6b5d2fa5ed3445b06938171dc062ce0e0f9a9def1f894fe"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.787503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8rkgs" event={"ID":"6f7ec79b-b314-472a-942f-6bf0fecd3318","Type":"ContainerStarted","Data":"464c459b1a76680af0a661c12bae0717c371b347b76b4ab5e69e6f9bbce9fcb1"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.789179 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" event={"ID":"7781484c-a6e3-48e2-b7ce-975d701624c8","Type":"ContainerStarted","Data":"e5ca40fd36d1e7edda285dc205fc7037dc1a37ee3c900cb2efb694b1b48ef378"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.789238 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" event={"ID":"7781484c-a6e3-48e2-b7ce-975d701624c8","Type":"ContainerStarted","Data":"cb5f217daf68e8655e64f80fc92f92b41d21ececca10fed97832e9ddeef5fe5a"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 
13:23:22.794534 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"330c2e2db25b050b0494bfe400c36c1d57ea4e56c1d3a874bf5edb280d6a2b36"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.794612 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.794630 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"ac826f5917b70cc48fb40e4ad2c7558e1f61ed53986c1d2caa9ef2ffb13eb7f2"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.796101 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerStarted","Data":"090ade797d495226dd39ddbcfb5efc8292bb1fee47b60e49b1e667c054c06175"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.796160 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerStarted","Data":"bab9df43ec9fb4d05c2f050dd3e18330b4700cd1df7b1889814c1c0fe931ee10"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.797676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dv5sj" event={"ID":"b0691d92-79a3-4d54-ae28-08da184e162f","Type":"ContainerStarted","Data":"f005779151e8fbbcf76be2d74edf4dd20f10486b0457e82c16da292b3ec3bbb3"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.797743 
4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dv5sj" event={"ID":"b0691d92-79a3-4d54-ae28-08da184e162f","Type":"ContainerStarted","Data":"c1ba6be909e5b1029212b0e75b5b8c3d3e03b09b7d7c1b44362ba8dbdc698ab6"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.800620 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5xtn2" event={"ID":"cab32ac6-a22f-4e11-9eaf-4c50ffbce748","Type":"ContainerStarted","Data":"c09f78e64ccb74ba75fb916374c78e76f93d74bac69d22c44e8e7d3ecb5ade2e"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.800678 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5xtn2" event={"ID":"cab32ac6-a22f-4e11-9eaf-4c50ffbce748","Type":"ContainerStarted","Data":"1f1fc9e30372052da41ec4701cc08fb447d0e066d122e28486e2cf5ad5f67e55"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.806901 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" event={"ID":"d4e23d99-76bd-4135-b5f3-0bb176a9ca29","Type":"ContainerStarted","Data":"21d33bec827e581f6eb3d97cae37bd42672f7166a238a58c15e072726d27e585"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.808914 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" exitCode=0 Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.808967 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.808999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" 
event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"939c389635ab19eb220e42525e882c3143bdd62ffee5f02c304fd0b9f3583d8e"} Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.811102 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8rkgs" podStartSLOduration=53.811077188 podStartE2EDuration="53.811077188s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:22.810778882 +0000 UTC m=+102.320497878" watchObservedRunningTime="2026-03-20 13:23:22.811077188 +0000 UTC m=+102.320796154" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.827758 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podStartSLOduration=53.827735299 podStartE2EDuration="53.827735299s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:22.827530975 +0000 UTC m=+102.337249981" watchObservedRunningTime="2026-03-20 13:23:22.827735299 +0000 UTC m=+102.337454265" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.867583 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7dj6" podStartSLOduration=53.867564314 podStartE2EDuration="53.867564314s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:22.866743855 +0000 UTC m=+102.376462831" watchObservedRunningTime="2026-03-20 13:23:22.867564314 +0000 UTC m=+102.377283290" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.890602 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dv5sj" podStartSLOduration=53.890581382 podStartE2EDuration="53.890581382s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:22.89049398 +0000 UTC m=+102.400212966" watchObservedRunningTime="2026-03-20 13:23:22.890581382 +0000 UTC m=+102.400300348" Mar 20 13:23:22 crc kubenswrapper[4895]: I0320 13:23:22.910051 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5xtn2" podStartSLOduration=53.910028934 podStartE2EDuration="53.910028934s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:22.910006373 +0000 UTC m=+102.419725349" watchObservedRunningTime="2026-03-20 13:23:22.910028934 +0000 UTC m=+102.419747900" Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.105040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:23 crc kubenswrapper[4895]: E0320 13:23:23.105176 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:23 crc kubenswrapper[4895]: E0320 13:23:23.105226 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs podName:e899877b-fe80-4ace-9b35-41eb7302cf12 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:24.105210615 +0000 UTC m=+103.614929581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs") pod "network-metrics-daemon-t9xh5" (UID: "e899877b-fe80-4ace-9b35-41eb7302cf12") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.568180 4895 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.814202 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="090ade797d495226dd39ddbcfb5efc8292bb1fee47b60e49b1e667c054c06175" exitCode=0 Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.814322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"090ade797d495226dd39ddbcfb5efc8292bb1fee47b60e49b1e667c054c06175"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.820460 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" event={"ID":"d4e23d99-76bd-4135-b5f3-0bb176a9ca29","Type":"ContainerStarted","Data":"5affebfcb776f45b7983f2c0efa01a55abc3bdf24c924a32ad8d3ecd197fcb53"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.820538 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" event={"ID":"d4e23d99-76bd-4135-b5f3-0bb176a9ca29","Type":"ContainerStarted","Data":"1dd57d017128f900f35ab63fff4fe3891e1820160caf8ba8b8426522d835519d"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826285 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826327 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826355 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826368 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.826379 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:23:23 crc kubenswrapper[4895]: I0320 13:23:23.866847 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9wqgt" podStartSLOduration=53.866826478 podStartE2EDuration="53.866826478s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:23.866784407 +0000 UTC m=+103.376503393" watchObservedRunningTime="2026-03-20 13:23:23.866826478 +0000 UTC m=+103.376545454" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.014106 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.014269 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.014247165 +0000 UTC m=+119.523966131 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.014332 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.014375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.014474 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.014499 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.014524 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.014516671 +0000 UTC m=+119.524235637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.014544 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.014532701 +0000 UTC m=+119.524251687 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.115809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.115882 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.115929 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.115937 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.115957 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.115970 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116012 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116033 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:24 crc 
kubenswrapper[4895]: E0320 13:23:24.116043 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116016 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116023 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.116009991 +0000 UTC m=+119.625728957 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116092 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.116082393 +0000 UTC m=+119.625801359 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.116103 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs podName:e899877b-fe80-4ace-9b35-41eb7302cf12 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:26.116097853 +0000 UTC m=+105.625816819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs") pod "network-metrics-daemon-t9xh5" (UID: "e899877b-fe80-4ace-9b35-41eb7302cf12") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.211341 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.211472 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.211501 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.211536 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.212036 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.212109 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.212159 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:24 crc kubenswrapper[4895]: E0320 13:23:24.212185 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.832190 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="f2e8c0a90c9464a9c62b36607d29dc1019407586dc238f382f051367af62c70e" exitCode=0 Mar 20 13:23:24 crc kubenswrapper[4895]: I0320 13:23:24.832300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"f2e8c0a90c9464a9c62b36607d29dc1019407586dc238f382f051367af62c70e"} Mar 20 13:23:25 crc kubenswrapper[4895]: I0320 13:23:25.836868 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="f4c54ca4e74b1f361d727eca6287ae3d7c9be501ad99116b9fa23a899dcf5ab5" exitCode=0 Mar 20 13:23:25 crc kubenswrapper[4895]: I0320 13:23:25.836955 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"f4c54ca4e74b1f361d727eca6287ae3d7c9be501ad99116b9fa23a899dcf5ab5"} Mar 20 13:23:25 crc kubenswrapper[4895]: I0320 13:23:25.845343 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.137711 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 
20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.137905 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.137992 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs podName:e899877b-fe80-4ace-9b35-41eb7302cf12 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:30.13797482 +0000 UTC m=+109.647693786 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs") pod "network-metrics-daemon-t9xh5" (UID: "e899877b-fe80-4ace-9b35-41eb7302cf12") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.211448 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.211462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.211508 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.211545 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.211713 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.211809 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.211904 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:26 crc kubenswrapper[4895]: E0320 13:23:26.212015 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.851431 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="fc900ad2d382375ac895260257e67cea64ec25fe75c5e4dced7f4c0ced4e676a" exitCode=0 Mar 20 13:23:26 crc kubenswrapper[4895]: I0320 13:23:26.851486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"fc900ad2d382375ac895260257e67cea64ec25fe75c5e4dced7f4c0ced4e676a"} Mar 20 13:23:27 crc kubenswrapper[4895]: I0320 13:23:27.865847 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="a199c37888b2705c62e55b02156c2482395c6116c9208497de3dad98423dbaa2" exitCode=0 Mar 20 13:23:27 crc kubenswrapper[4895]: I0320 13:23:27.865936 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"a199c37888b2705c62e55b02156c2482395c6116c9208497de3dad98423dbaa2"} Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.211129 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.211569 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.211197 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.211191 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:28 crc kubenswrapper[4895]: E0320 13:23:28.211653 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:28 crc kubenswrapper[4895]: E0320 13:23:28.211716 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:28 crc kubenswrapper[4895]: E0320 13:23:28.211782 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:28 crc kubenswrapper[4895]: E0320 13:23:28.211835 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.873756 4895 generic.go:334] "Generic (PLEG): container finished" podID="a332b86a-c2f9-4702-9829-ba837dc7c404" containerID="61eee159eb27eba431488c37887e22b5c01534924af81aeb81f9bcbaa168cbea" exitCode=0 Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.873828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerDied","Data":"61eee159eb27eba431488c37887e22b5c01534924af81aeb81f9bcbaa168cbea"} Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.883197 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerStarted","Data":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.883688 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.883797 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.884133 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.932609 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.935424 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:28 crc kubenswrapper[4895]: I0320 13:23:28.961724 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podStartSLOduration=58.961697433 podStartE2EDuration="58.961697433s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:28.961425917 +0000 UTC m=+108.471144903" watchObservedRunningTime="2026-03-20 13:23:28.961697433 +0000 UTC m=+108.471416419" Mar 20 13:23:29 crc kubenswrapper[4895]: I0320 13:23:29.892079 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krtrm" event={"ID":"a332b86a-c2f9-4702-9829-ba837dc7c404","Type":"ContainerStarted","Data":"5ebcd2691e03165dd0bd50e2cad57ba914e10018293166cf754d934181f54b3a"} Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.186021 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.186201 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.186259 
4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs podName:e899877b-fe80-4ace-9b35-41eb7302cf12 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.186239483 +0000 UTC m=+117.695958459 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs") pod "network-metrics-daemon-t9xh5" (UID: "e899877b-fe80-4ace-9b35-41eb7302cf12") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.211280 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.211337 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.211298 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.211411 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.211436 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.211574 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.211680 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.211792 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.646327 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-krtrm" podStartSLOduration=61.646297257 podStartE2EDuration="1m1.646297257s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:29.91952148 +0000 UTC m=+109.429240506" watchObservedRunningTime="2026-03-20 13:23:30.646297257 +0000 UTC m=+110.156016263" Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.648539 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t9xh5"] Mar 20 13:23:30 crc kubenswrapper[4895]: I0320 13:23:30.894717 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:30 crc kubenswrapper[4895]: E0320 13:23:30.895153 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:32 crc kubenswrapper[4895]: I0320 13:23:32.211052 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:32 crc kubenswrapper[4895]: I0320 13:23:32.211119 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:32 crc kubenswrapper[4895]: I0320 13:23:32.211201 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:32 crc kubenswrapper[4895]: E0320 13:23:32.211430 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:32 crc kubenswrapper[4895]: E0320 13:23:32.211546 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:32 crc kubenswrapper[4895]: E0320 13:23:32.211654 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.210884 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:33 crc kubenswrapper[4895]: E0320 13:23:33.211092 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t9xh5" podUID="e899877b-fe80-4ace-9b35-41eb7302cf12" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.492593 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.492861 4895 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.547370 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.548100 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.548484 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.549478 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.550891 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.552505 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.555619 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.556474 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.562805 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.563261 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.563424 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.563605 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.563752 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.563933 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564022 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564231 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g8hxm"] Mar 20 13:23:33 crc kubenswrapper[4895]: 
I0320 13:23:33.564245 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564903 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564952 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564962 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564432 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564577 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564633 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564787 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.564873 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.565291 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.565345 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.565580 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.565730 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.565783 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.566762 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.567385 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.567797 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.568612 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.568972 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.572918 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.583310 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8wls"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.583658 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.584349 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.611505 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.611806 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.611887 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.613175 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.616288 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mcm6r"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.617218 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.618037 4895 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626469 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdb6fe4-677b-4a89-92b3-4a95114180c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626517 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-encryption-config\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626542 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdb6fe4-677b-4a89-92b3-4a95114180c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626575 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626597 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-serving-cert\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626618 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-client\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626680 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8b8q\" (UniqueName: \"kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626731 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cwb\" (UniqueName: \"kubernetes.io/projected/a2980104-0602-4c36-8c7b-3877a591bc14-kube-api-access-t4cwb\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626753 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2980104-0602-4c36-8c7b-3877a591bc14-audit-dir\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626796 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-audit-policies\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626840 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626869 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626900 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrw6w\" (UniqueName: \"kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626921 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgd9g\" (UniqueName: \"kubernetes.io/projected/7fdb6fe4-677b-4a89-92b3-4a95114180c7-kube-api-access-pgd9g\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626943 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626964 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.626985 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.633525 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4pshc"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.635160 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qml88"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.636433 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbwjf"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.636957 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.637107 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.637235 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.638267 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.638870 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.639189 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nrtwd"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.639915 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.641102 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-px7gz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.641716 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.700373 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.700929 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.701368 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.701879 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702111 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702351 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702589 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702775 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702849 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.702957 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.703361 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.703475 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xxspw"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.703944 4895 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.704089 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.704442 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.704805 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.705238 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.705434 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.705456 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.705735 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.721515 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.721755 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722059 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722213 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722243 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722249 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722342 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722359 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722560 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722607 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722746 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722836 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722925 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722912 
4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723000 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722343 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723098 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723166 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723307 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723738 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722841 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722744 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722564 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723106 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.723130 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.722807 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.724804 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.724939 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725074 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725159 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725296 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725420 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725495 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.725570 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727499 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-config\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrw6w\" (UniqueName: \"kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727596 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727613 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727630 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgd9g\" (UniqueName: \"kubernetes.io/projected/7fdb6fe4-677b-4a89-92b3-4a95114180c7-kube-api-access-pgd9g\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727647 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727663 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727701 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vf97\" (UniqueName: 
\"kubernetes.io/projected/4d6abede-a152-40a5-a419-838da91e1e7e-kube-api-access-4vf97\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727753 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-serving-cert\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc 
kubenswrapper[4895]: I0320 13:23:33.727787 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228lv\" (UniqueName: \"kubernetes.io/projected/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-kube-api-access-228lv\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727803 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdb6fe4-677b-4a89-92b3-4a95114180c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727820 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-trusted-ca\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727838 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir\") pod 
\"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727876 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727890 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-encryption-config\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727905 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdb6fe4-677b-4a89-92b3-4a95114180c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727920 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9prgk\" (UniqueName: \"kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.727964 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005b37f-581a-4651-9dcb-f16414503616-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728002 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-encryption-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728017 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1b23c96-a749-4ee0-b8d5-8619916be03a-serving-cert\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728034 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-client\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc 
kubenswrapper[4895]: I0320 13:23:33.728051 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-config\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728064 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728084 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728122 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-serving-cert\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: 
\"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728151 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-service-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728168 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n6wh\" (UniqueName: \"kubernetes.io/projected/22aa23b6-96e1-49b3-bbb9-d414e27df43b-kube-api-access-2n6wh\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728187 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6abede-a152-40a5-a419-838da91e1e7e-serving-cert\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728201 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit-dir\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728229 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728244 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit\") pod \"apiserver-76f77b778f-g8hxm\" (UID: 
\"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728272 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-auth-proxy-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-config\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhcf\" (UniqueName: \"kubernetes.io/projected/4005b37f-581a-4651-9dcb-f16414503616-kube-api-access-sxhcf\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-client\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728358 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728420 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-node-pullsecrets\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728450 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqqtr\" (UniqueName: \"kubernetes.io/projected/c1b23c96-a749-4ee0-b8d5-8619916be03a-kube-api-access-rqqtr\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " 
pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728480 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-metrics-tls\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728496 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728518 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728564 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5rtl\" (UniqueName: \"kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728584 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b8b8q\" (UniqueName: \"kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728602 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpwgt\" (UniqueName: \"kubernetes.io/projected/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-kube-api-access-lpwgt\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728633 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cwb\" (UniqueName: \"kubernetes.io/projected/a2980104-0602-4c36-8c7b-3877a591bc14-kube-api-access-t4cwb\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728648 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2980104-0602-4c36-8c7b-3877a591bc14-audit-dir\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: 
\"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728662 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728677 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728694 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jl7k\" (UniqueName: \"kubernetes.io/projected/ed200aaa-4ed3-4e46-a934-9e97a94e0738-kube-api-access-4jl7k\") pod \"downloads-7954f5f757-mcm6r\" (UID: \"ed200aaa-4ed3-4e46-a934-9e97a94e0738\") " pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728709 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hc8d\" (UniqueName: \"kubernetes.io/projected/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-kube-api-access-9hc8d\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728739 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-images\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728753 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728781 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728796 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 
13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728814 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-audit-policies\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-image-import-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728846 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728863 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a8a5873-21a7-4e54-8492-a8b86a088023-machine-approver-tls\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728878 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728913 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h584j\" (UniqueName: \"kubernetes.io/projected/8a8a5873-21a7-4e54-8492-a8b86a088023-kube-api-access-h584j\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728929 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-serving-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 
13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.728998 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmljv\" (UniqueName: \"kubernetes.io/projected/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-kube-api-access-fmljv\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.745233 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2980104-0602-4c36-8c7b-3877a591bc14-audit-dir\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.752803 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.753373 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.753458 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.753564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.754096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-audit-policies\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.754124 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.754168 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdb6fe4-677b-4a89-92b3-4a95114180c7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.754216 4895 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.757609 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.757878 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.757977 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758106 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758193 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758270 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758284 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758350 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758447 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758539 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:23:33 
crc kubenswrapper[4895]: I0320 13:23:33.758604 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758742 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758863 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758953 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758103 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-client\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.759240 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.759355 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758114 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758144 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.758753 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.760049 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.760054 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2980104-0602-4c36-8c7b-3877a591bc14-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.760433 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.760739 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.760870 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.761063 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.762448 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.763589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.765977 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.759995 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.767575 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fgv7q"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.768178 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-frk59"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.769741 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.769829 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.770451 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.770855 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.772484 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.773287 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.773847 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.774938 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.775151 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.775519 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.775659 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fdb6fe4-677b-4a89-92b3-4a95114180c7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.775833 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.776815 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.781241 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.782075 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.782873 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.784744 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.786508 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-serving-cert\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.800115 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.800888 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.801417 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2980104-0602-4c36-8c7b-3877a591bc14-encryption-config\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.801677 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.804811 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.805284 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.805313 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.805284 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.804895 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.806792 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.806977 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.807098 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.809229 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.812630 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.812766 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.813325 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.813643 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.814243 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.816426 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c6qbf"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.817056 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.818496 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.819285 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.819529 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.820000 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.820890 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.821440 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.821945 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.822337 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.822868 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.823227 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.823758 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wvm6"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.824256 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.826932 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.827419 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.827576 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.828519 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.828667 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcm6r"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829778 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jl7k\" (UniqueName: \"kubernetes.io/projected/ed200aaa-4ed3-4e46-a934-9e97a94e0738-kube-api-access-4jl7k\") pod \"downloads-7954f5f757-mcm6r\" (UID: \"ed200aaa-4ed3-4e46-a934-9e97a94e0738\") " pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hc8d\" (UniqueName: \"kubernetes.io/projected/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-kube-api-access-9hc8d\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: 
I0320 13:23:33.829872 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829905 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-images\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-image-import-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 
crc kubenswrapper[4895]: I0320 13:23:33.829951 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829976 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a8a5873-21a7-4e54-8492-a8b86a088023-machine-approver-tls\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829994 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.829997 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830019 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h584j\" (UniqueName: \"kubernetes.io/projected/8a8a5873-21a7-4e54-8492-a8b86a088023-kube-api-access-h584j\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830043 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-serving-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830080 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmljv\" (UniqueName: \"kubernetes.io/projected/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-kube-api-access-fmljv\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830116 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: 
\"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830150 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830167 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-config\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830191 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830208 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830224 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-serving-cert\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830243 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228lv\" (UniqueName: \"kubernetes.io/projected/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-kube-api-access-228lv\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830259 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vf97\" (UniqueName: \"kubernetes.io/projected/4d6abede-a152-40a5-a419-838da91e1e7e-kube-api-access-4vf97\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830277 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830306 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-trusted-ca\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 
13:23:33.830353 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830373 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830407 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830449 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prgk\" (UniqueName: \"kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 
13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830460 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qml88"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005b37f-581a-4651-9dcb-f16414503616-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830487 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-client\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830503 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-encryption-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830518 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1b23c96-a749-4ee0-b8d5-8619916be03a-serving-cert\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830551 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-config\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-service-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830585 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n6wh\" (UniqueName: \"kubernetes.io/projected/22aa23b6-96e1-49b3-bbb9-d414e27df43b-kube-api-access-2n6wh\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830621 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6abede-a152-40a5-a419-838da91e1e7e-serving-cert\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit-dir\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830655 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830672 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc 
kubenswrapper[4895]: I0320 13:23:33.830741 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830787 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830811 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-auth-proxy-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830838 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxhcf\" (UniqueName: \"kubernetes.io/projected/4005b37f-581a-4651-9dcb-f16414503616-kube-api-access-sxhcf\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830863 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-config\") pod \"console-operator-58897d9998-vbwjf\" (UID: 
\"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830938 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-node-pullsecrets\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830962 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqqtr\" (UniqueName: \"kubernetes.io/projected/c1b23c96-a749-4ee0-b8d5-8619916be03a-kube-api-access-rqqtr\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.830986 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-metrics-tls\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831007 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831029 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831051 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5rtl\" (UniqueName: \"kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831081 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpwgt\" (UniqueName: \"kubernetes.io/projected/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-kube-api-access-lpwgt\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831107 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.831407 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.832357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-config\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.832833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.833970 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.833988 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-auth-proxy-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.834357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.835377 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.835498 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-serving-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.836080 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.836085 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.837349 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.837669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-trusted-ca\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.838593 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.838732 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.838737 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-node-pullsecrets\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.839272 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b23c96-a749-4ee0-b8d5-8619916be03a-config\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.839336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.839363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.840013 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.840016 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.840363 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4pshc"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.840421 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-frk59"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.840889 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-service-ca-bundle\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.841541 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca\") pod \"console-f9d7485db-wrj6w\" (UID: 
\"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.841766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.842004 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6abede-a152-40a5-a419-838da91e1e7e-config\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.842201 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/22aa23b6-96e1-49b3-bbb9-d414e27df43b-image-import-ca\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.842353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.842923 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4005b37f-581a-4651-9dcb-f16414503616-images\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843251 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843344 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/22aa23b6-96e1-49b3-bbb9-d414e27df43b-audit-dir\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843483 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843520 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-serving-cert\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.843923 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1b23c96-a749-4ee0-b8d5-8619916be03a-serving-cert\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.844092 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.844135 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.844348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8a5873-21a7-4e54-8492-a8b86a088023-config\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.846668 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-px7gz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.848224 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xxspw"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.849703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8a8a5873-21a7-4e54-8492-a8b86a088023-machine-approver-tls\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.849729 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6abede-a152-40a5-a419-838da91e1e7e-serving-cert\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.849750 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.850414 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.850432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.850609 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005b37f-581a-4651-9dcb-f16414503616-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.850696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-metrics-tls\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.850797 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" 
Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.851054 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-encryption-config\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.851085 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.852737 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.853141 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.851582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/22aa23b6-96e1-49b3-bbb9-d414e27df43b-etcd-client\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.855295 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.856318 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session\") 
pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.856334 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.857204 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.858248 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9hvwh"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.860480 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hvtz7"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.864213 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbwjf"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.864258 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.864354 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.864910 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.869361 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nrtwd"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.870616 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8wls"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.872697 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.875562 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.876510 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.876738 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.882043 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-gm5sr"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.884058 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wvm6"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.886179 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.891444 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.891881 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.894368 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.896448 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.898690 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9hvwh"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.899581 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.901251 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g8hxm"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.902674 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.903658 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c6qbf"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.904956 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.906078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.906958 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.908290 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fpmdm"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.909130 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.909336 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fpmdm"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.909741 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.912074 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7dpsb"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.912806 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.913588 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7dpsb"] Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.933616 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.949929 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.969534 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:23:33 crc kubenswrapper[4895]: I0320 13:23:33.990484 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.026097 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrw6w\" (UniqueName: \"kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w\") pod \"route-controller-manager-6576b87f9c-xc2mg\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.054925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgd9g\" (UniqueName: \"kubernetes.io/projected/7fdb6fe4-677b-4a89-92b3-4a95114180c7-kube-api-access-pgd9g\") pod \"openshift-apiserver-operator-796bbdcf4f-q9wq4\" (UID: \"7fdb6fe4-677b-4a89-92b3-4a95114180c7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.087208 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8b8q\" (UniqueName: \"kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q\") pod \"controller-manager-879f6c89f-2fv7p\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.107282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cwb\" (UniqueName: \"kubernetes.io/projected/a2980104-0602-4c36-8c7b-3877a591bc14-kube-api-access-t4cwb\") pod \"apiserver-7bbb656c7d-wchf2\" (UID: \"a2980104-0602-4c36-8c7b-3877a591bc14\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.111728 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.130341 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.150156 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.170174 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.191601 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.211516 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.211559 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.211608 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.211506 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.221522 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.230883 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.247931 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.250721 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.256713 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.270853 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.291168 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.310520 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.332886 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.352663 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.372385 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.398993 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.410675 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.430532 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.450859 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 
13:23:34.471056 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.492692 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.510542 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.519181 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"] Mar 20 13:23:34 crc kubenswrapper[4895]: W0320 13:23:34.525077 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d87434_561d_4397_94b6_1a96d6286361.slice/crio-e986acd6f7a7221c71b56b236989170debeae86a33248d115b049afff9b8cc2f WatchSource:0}: Error finding container e986acd6f7a7221c71b56b236989170debeae86a33248d115b049afff9b8cc2f: Status 404 returned error can't find the container with id e986acd6f7a7221c71b56b236989170debeae86a33248d115b049afff9b8cc2f Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.530086 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.554334 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.570581 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: 
I0320 13:23:34.590474 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.610648 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.631295 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.650268 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.670568 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.678329 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"] Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.682207 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4"] Mar 20 13:23:34 crc kubenswrapper[4895]: W0320 13:23:34.688454 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaccdc13f_9bde_41cd_8caf_e1aafdf7d913.slice/crio-23531465a81f0e097f83dfe5799d4c2f88a71a583c045b311ca5e9726877e85d WatchSource:0}: Error finding container 23531465a81f0e097f83dfe5799d4c2f88a71a583c045b311ca5e9726877e85d: Status 404 returned error can't find the container with id 23531465a81f0e097f83dfe5799d4c2f88a71a583c045b311ca5e9726877e85d Mar 20 13:23:34 crc kubenswrapper[4895]: W0320 13:23:34.690205 4895 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdb6fe4_677b_4a89_92b3_4a95114180c7.slice/crio-0e750bae59cc9c2d798ee0af746abb93a636ad4db3d4504aa2146acf4cf0bec7 WatchSource:0}: Error finding container 0e750bae59cc9c2d798ee0af746abb93a636ad4db3d4504aa2146acf4cf0bec7: Status 404 returned error can't find the container with id 0e750bae59cc9c2d798ee0af746abb93a636ad4db3d4504aa2146acf4cf0bec7 Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.690231 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.711145 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.728948 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"] Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.730565 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:23:34 crc kubenswrapper[4895]: W0320 13:23:34.741973 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2980104_0602_4c36_8c7b_3877a591bc14.slice/crio-b80804c278aa781317cf762278cecc429539d090a3e15c9c5b9c76dbad77c298 WatchSource:0}: Error finding container b80804c278aa781317cf762278cecc429539d090a3e15c9c5b9c76dbad77c298: Status 404 returned error can't find the container with id b80804c278aa781317cf762278cecc429539d090a3e15c9c5b9c76dbad77c298 Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.751013 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:23:34 crc 
kubenswrapper[4895]: I0320 13:23:34.770493 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.791047 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.810639 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.829577 4895 request.go:700] Waited for 1.015117549s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.839265 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.850205 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.870236 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.890638 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.908059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" event={"ID":"a2980104-0602-4c36-8c7b-3877a591bc14","Type":"ContainerStarted","Data":"b80804c278aa781317cf762278cecc429539d090a3e15c9c5b9c76dbad77c298"} Mar 20 
13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.910474 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" event={"ID":"7fdb6fe4-677b-4a89-92b3-4a95114180c7","Type":"ContainerStarted","Data":"69767d1a872bc78f0031214a3dc8808c90896e7a93f9306ed5c3a7217a4bc37c"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.910499 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" event={"ID":"7fdb6fe4-677b-4a89-92b3-4a95114180c7","Type":"ContainerStarted","Data":"0e750bae59cc9c2d798ee0af746abb93a636ad4db3d4504aa2146acf4cf0bec7"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.910885 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.912816 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" event={"ID":"accdc13f-9bde-41cd-8caf-e1aafdf7d913","Type":"ContainerStarted","Data":"78aff67b89b646349bb8d0f231a6a873784529221d5a59a18d1d1f0991fc987d"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.912848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" event={"ID":"accdc13f-9bde-41cd-8caf-e1aafdf7d913","Type":"ContainerStarted","Data":"23531465a81f0e097f83dfe5799d4c2f88a71a583c045b311ca5e9726877e85d"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.913538 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.914542 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" 
event={"ID":"33d87434-561d-4397-94b6-1a96d6286361","Type":"ContainerStarted","Data":"835f7b9ecf53abfe53a978d19f4ade769967324058c227d6c8bdc69877d9076d"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.914575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" event={"ID":"33d87434-561d-4397-94b6-1a96d6286361","Type":"ContainerStarted","Data":"e986acd6f7a7221c71b56b236989170debeae86a33248d115b049afff9b8cc2f"} Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.914827 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.915151 4895 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-2fv7p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.915183 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.915828 4895 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-xc2mg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.916127 4895 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" podUID="33d87434-561d-4397-94b6-1a96d6286361" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.930021 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.950649 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.970608 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:23:34 crc kubenswrapper[4895]: I0320 13:23:34.990800 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.011030 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.030074 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.051225 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.071140 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.091502 4895 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.111333 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.131970 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.151502 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.171316 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.190961 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.211125 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.212124 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.213170 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.232967 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.251439 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.270912 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.291804 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.311077 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.330458 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.371263 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.380277 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jl7k\" (UniqueName: 
\"kubernetes.io/projected/ed200aaa-4ed3-4e46-a934-9e97a94e0738-kube-api-access-4jl7k\") pod \"downloads-7954f5f757-mcm6r\" (UID: \"ed200aaa-4ed3-4e46-a934-9e97a94e0738\") " pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.391655 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.411937 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.430986 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.474870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hc8d\" (UniqueName: \"kubernetes.io/projected/2a86aab5-8a4e-44e0-a4be-e5f05857e3ca-kube-api-access-9hc8d\") pod \"cluster-samples-operator-665b6dd947-gtgnx\" (UID: \"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.487803 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.489953 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h584j\" (UniqueName: \"kubernetes.io/projected/8a8a5873-21a7-4e54-8492-a8b86a088023-kube-api-access-h584j\") pod \"machine-approver-56656f9798-9czxh\" (UID: \"8a8a5873-21a7-4e54-8492-a8b86a088023\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.496552 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.511372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmljv\" (UniqueName: \"kubernetes.io/projected/a3ba85f8-49ce-44f5-8a95-914fbcca3a8e-kube-api-access-fmljv\") pod \"openshift-config-operator-7777fb866f-qml88\" (UID: \"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:35 crc kubenswrapper[4895]: W0320 13:23:35.517609 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8a5873_21a7_4e54_8492_a8b86a088023.slice/crio-70d93a336c030f1e780158d51e4e127d099d3af3c768a0077a33beeb38f9e04f WatchSource:0}: Error finding container 70d93a336c030f1e780158d51e4e127d099d3af3c768a0077a33beeb38f9e04f: Status 404 returned error can't find the container with id 70d93a336c030f1e780158d51e4e127d099d3af3c768a0077a33beeb38f9e04f Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.523809 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.530635 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228lv\" (UniqueName: \"kubernetes.io/projected/bcebbd24-997c-44b3-bc3c-cedb2caf8e89-kube-api-access-228lv\") pod \"dns-operator-744455d44c-nrtwd\" (UID: \"bcebbd24-997c-44b3-bc3c-cedb2caf8e89\") " pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.550273 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vf97\" (UniqueName: \"kubernetes.io/projected/4d6abede-a152-40a5-a419-838da91e1e7e-kube-api-access-4vf97\") pod \"authentication-operator-69f744f599-4pshc\" (UID: \"4d6abede-a152-40a5-a419-838da91e1e7e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.554303 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.569701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxhcf\" (UniqueName: \"kubernetes.io/projected/4005b37f-581a-4651-9dcb-f16414503616-kube-api-access-sxhcf\") pod \"machine-api-operator-5694c8668f-px7gz\" (UID: \"4005b37f-581a-4651-9dcb-f16414503616\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.593833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqqtr\" (UniqueName: \"kubernetes.io/projected/c1b23c96-a749-4ee0-b8d5-8619916be03a-kube-api-access-rqqtr\") pod \"console-operator-58897d9998-vbwjf\" (UID: \"c1b23c96-a749-4ee0-b8d5-8619916be03a\") " pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.606668 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.614064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5rtl\" (UniqueName: \"kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl\") pod \"oauth-openshift-558db77b4-q8wls\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.626782 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.638827 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpwgt\" (UniqueName: \"kubernetes.io/projected/a55b1e0b-6071-4fcf-8ca4-f9931fab9b17-kube-api-access-lpwgt\") pod \"openshift-controller-manager-operator-756b6f6bc6-psd4t\" (UID: \"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.650302 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.657434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prgk\" (UniqueName: \"kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk\") pod \"console-f9d7485db-wrj6w\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.667823 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n6wh\" (UniqueName: \"kubernetes.io/projected/22aa23b6-96e1-49b3-bbb9-d414e27df43b-kube-api-access-2n6wh\") pod \"apiserver-76f77b778f-g8hxm\" (UID: \"22aa23b6-96e1-49b3-bbb9-d414e27df43b\") " pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.670038 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.672897 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.694530 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.706293 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.712855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.734531 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.753124 4895 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.756759 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx"] Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.768311 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.770063 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.792024 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.806304 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.808063 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcm6r"] Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.812135 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.812198 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.831593 4895 request.go:700] Waited for 1.921990002s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.834176 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.849104 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4pshc"] Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.852758 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.873258 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.894946 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.910088 4895 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.926150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcm6r" event={"ID":"ed200aaa-4ed3-4e46-a934-9e97a94e0738","Type":"ContainerStarted","Data":"ba4b2b809f97ebf3968d40fd76e2a4286c4eb78a6cb75b4bf07c2e6854e59d8a"} Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.927329 4895 generic.go:334] "Generic (PLEG): container finished" podID="a2980104-0602-4c36-8c7b-3877a591bc14" containerID="0b96f36d85d7ab12f4d0433c1c898ba3e3456f347cd5213a639d6e9f0e6caf06" exitCode=0 Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.927611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" event={"ID":"a2980104-0602-4c36-8c7b-3877a591bc14","Type":"ContainerDied","Data":"0b96f36d85d7ab12f4d0433c1c898ba3e3456f347cd5213a639d6e9f0e6caf06"} Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.934720 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.938792 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" event={"ID":"8a8a5873-21a7-4e54-8492-a8b86a088023","Type":"ContainerStarted","Data":"70d93a336c030f1e780158d51e4e127d099d3af3c768a0077a33beeb38f9e04f"} Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.943323 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.947593 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57"} Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.948435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.951805 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" event={"ID":"4d6abede-a152-40a5-a419-838da91e1e7e","Type":"ContainerStarted","Data":"5b44743064b54443490154f6f36757bb2cd6e6823f4db83c812e160e9f598763"} Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.952609 4895 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-2fv7p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.952643 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.985310 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" Mar 20 13:23:35 crc kubenswrapper[4895]: I0320 13:23:35.990414 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.011683 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.031744 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.050860 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.057345 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.067925 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.067962 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ff7fe59-fc6e-4a7f-826f-71f45302c074-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.067982 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c06369a-5668-4982-aaba-acc71c4c4ce1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068017 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c06369a-5668-4982-aaba-acc71c4c4ce1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068033 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068048 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-serving-cert\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068119 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068137 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-config\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068204 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068236 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-service-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068303 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 
crc kubenswrapper[4895]: I0320 13:23:36.068320 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068337 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ff7fe59-fc6e-4a7f-826f-71f45302c074-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068361 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt26\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068414 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv64\" (UniqueName: 
\"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-kube-api-access-6xv64\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068474 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068492 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk7gz\" (UniqueName: \"kubernetes.io/projected/4db586ef-5bd4-45b9-af5d-825ac88a79e2-kube-api-access-tk7gz\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068564 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-client\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.068580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c06369a-5668-4982-aaba-acc71c4c4ce1-config\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.070550 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:36.570538713 +0000 UTC m=+116.080257679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.072792 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.090366 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:23:36 crc kubenswrapper[4895]: W0320 13:23:36.123582 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda55b1e0b_6071_4fcf_8ca4_f9931fab9b17.slice/crio-4d17adadd54264259906a802815b4c8b61f3f19a02fed1791b28f9f43343ae40 WatchSource:0}: Error finding container 4d17adadd54264259906a802815b4c8b61f3f19a02fed1791b28f9f43343ae40: Status 404 returned error can't find the container with id 4d17adadd54264259906a802815b4c8b61f3f19a02fed1791b28f9f43343ae40 Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.159862 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qml88"] Mar 20 
13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171323 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171501 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-mountpoint-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171526 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171542 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-registration-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdvl\" (UniqueName: \"kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl\") pod 
\"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171590 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171608 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-apiservice-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-proxy-tls\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cedee9dd-d9ad-43b7-97aa-f709c9a59604-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171657 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27950301-9838-48a8-a0d9-83a4083d2e0d-cert\") pod \"ingress-canary-fpmdm\" (UID: \"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171673 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt26\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171691 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-default-certificate\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171707 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhcfm\" (UniqueName: \"kubernetes.io/projected/210eb590-1075-4279-856f-0899b35e0021-kube-api-access-vhcfm\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171723 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-srv-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171741 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv64\" (UniqueName: \"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-kube-api-access-6xv64\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171759 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171785 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxh9\" (UniqueName: \"kubernetes.io/projected/617be708-08ca-4534-ae2d-2ae747070e51-kube-api-access-grxh9\") pod \"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171805 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk7gz\" (UniqueName: \"kubernetes.io/projected/4db586ef-5bd4-45b9-af5d-825ac88a79e2-kube-api-access-tk7gz\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171822 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171837 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0003f4f-09f6-47f2-85bb-21cac67d07a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e827db1-1343-4055-9fb1-739b183fbf0a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171868 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-client\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171885 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwl78\" (UniqueName: \"kubernetes.io/projected/27950301-9838-48a8-a0d9-83a4083d2e0d-kube-api-access-rwl78\") pod \"ingress-canary-fpmdm\" (UID: 
\"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171902 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c06369a-5668-4982-aaba-acc71c4c4ce1-config\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171917 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-metrics-certs\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171933 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-certs\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171947 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54f79e52-3f56-4f2b-a0b1-5a40f029633b-metrics-tls\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token\") pod 
\"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171978 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.171996 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c06369a-5668-4982-aaba-acc71c4c4ce1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172014 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ff7fe59-fc6e-4a7f-826f-71f45302c074-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172038 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172052 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0003f4f-09f6-47f2-85bb-21cac67d07a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172069 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e827db1-1343-4055-9fb1-739b183fbf0a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172087 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172104 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrj6\" (UniqueName: \"kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-serving-cert\") pod 
\"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172149 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cedee9dd-d9ad-43b7-97aa-f709c9a59604-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172165 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvxhn\" (UniqueName: \"kubernetes.io/projected/a99cb22b-8c01-4a64-b512-8e4f61fb0558-kube-api-access-mvxhn\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172180 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8676ece2-b364-4b7e-a585-7aa514beb173-serving-cert\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172196 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172210 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49v4\" (UniqueName: \"kubernetes.io/projected/2fe0311f-3e9d-4749-b06c-a28d7d889c45-kube-api-access-x49v4\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172226 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172248 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ea384-24f2-4134-9f53-3f6764f1a0d2-trusted-ca\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-cabundle\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172279 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-plugins-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 
13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172295 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-service-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172349 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgtm\" (UniqueName: \"kubernetes.io/projected/fec1c815-e7e7-4367-b126-66d1ff806bf7-kube-api-access-ctgtm\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172367 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-key\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/210eb590-1075-4279-856f-0899b35e0021-proxy-tls\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: 
\"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172937 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ff7fe59-fc6e-4a7f-826f-71f45302c074-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172954 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b79ea384-24f2-4134-9f53-3f6764f1a0d2-metrics-tls\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54f79e52-3f56-4f2b-a0b1-5a40f029633b-config-volume\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.172987 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgsv\" (UniqueName: 
\"kubernetes.io/projected/3585ee2c-5c20-4a41-8909-1363e4554ccf-kube-api-access-szgsv\") pod \"migrator-59844c95c7-2mx5m\" (UID: \"3585ee2c-5c20-4a41-8909-1363e4554ccf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173003 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/210eb590-1075-4279-856f-0899b35e0021-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173033 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-csi-data-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173067 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/357dbb1d-53bb-4dc2-9645-377146fed802-tmpfs\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 
20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173082 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thngc\" (UniqueName: \"kubernetes.io/projected/c0003f4f-09f6-47f2-85bb-21cac67d07a5-kube-api-access-thngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173097 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blbwn\" (UniqueName: \"kubernetes.io/projected/8676ece2-b364-4b7e-a585-7aa514beb173-kube-api-access-blbwn\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75zjr\" (UniqueName: \"kubernetes.io/projected/bc0a8d83-a2d4-4231-a024-85e6cf31955c-kube-api-access-75zjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173141 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e827db1-1343-4055-9fb1-739b183fbf0a-config\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173166 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cedee9dd-d9ad-43b7-97aa-f709c9a59604-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173181 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173196 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173221 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mp5\" (UniqueName: \"kubernetes.io/projected/6ffc6194-dadb-4607-a3c0-08dbdff1d476-kube-api-access-l6mp5\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173237 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/617be708-08ca-4534-ae2d-2ae747070e51-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173279 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbbwx\" (UniqueName: \"kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5k2\" (UniqueName: \"kubernetes.io/projected/3b742e38-4844-4cd1-8523-9dd476bf87fa-kube-api-access-pj5k2\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9sq\" (UniqueName: \"kubernetes.io/projected/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-kube-api-access-5p9sq\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173369 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/259c146d-1ce3-4a6d-bf50-36315be6efae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173402 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8676ece2-b364-4b7e-a585-7aa514beb173-config\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173420 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-node-bootstrap-token\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173436 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6zpf\" (UniqueName: \"kubernetes.io/projected/259c146d-1ce3-4a6d-bf50-36315be6efae-kube-api-access-g6zpf\") pod \"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173452 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173471 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c06369a-5668-4982-aaba-acc71c4c4ce1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173635 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc0a8d83-a2d4-4231-a024-85e6cf31955c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173656 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-images\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvck\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-kube-api-access-4zvck\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173698 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-webhook-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: 
\"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173714 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqcp2\" (UniqueName: \"kubernetes.io/projected/357dbb1d-53bb-4dc2-9645-377146fed802-kube-api-access-fqcp2\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173732 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.173747 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/54f79e52-3f56-4f2b-a0b1-5a40f029633b-kube-api-access-5cbx6\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.180227 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbwjf"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181135 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181302 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-config\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181334 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-srv-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-stats-auth\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181512 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181544 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181562 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndpz\" (UniqueName: \"kubernetes.io/projected/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-kube-api-access-qndpz\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-socket-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.181604 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe0311f-3e9d-4749-b06c-a28d7d889c45-service-ca-bundle\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.183869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-service-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 
13:23:36.184361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-serving-cert\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.186720 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.197819 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-ca\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.198326 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.201445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ff7fe59-fc6e-4a7f-826f-71f45302c074-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.204639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db586ef-5bd4-45b9-af5d-825ac88a79e2-config\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.213291 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.214733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c06369a-5668-4982-aaba-acc71c4c4ce1-config\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.216860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4db586ef-5bd4-45b9-af5d-825ac88a79e2-etcd-client\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.216995 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:36.716962878 +0000 UTC m=+116.226681834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.221633 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ff7fe59-fc6e-4a7f-826f-71f45302c074-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.222006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.241512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c06369a-5668-4982-aaba-acc71c4c4ce1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.247469 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv64\" (UniqueName: 
\"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-kube-api-access-6xv64\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.247748 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt26\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: W0320 13:23:36.248413 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ba85f8_49ce_44f5_8a95_914fbcca3a8e.slice/crio-7c16d94ab1cba4b0eeb3c449fba7ca5ba393fe41dd98d878d0be6988611e1444 WatchSource:0}: Error finding container 7c16d94ab1cba4b0eeb3c449fba7ca5ba393fe41dd98d878d0be6988611e1444: Status 404 returned error can't find the container with id 7c16d94ab1cba4b0eeb3c449fba7ca5ba393fe41dd98d878d0be6988611e1444 Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289476 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p9sq\" (UniqueName: \"kubernetes.io/projected/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-kube-api-access-5p9sq\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289562 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/259c146d-1ce3-4a6d-bf50-36315be6efae-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289587 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-node-bootstrap-token\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289605 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8676ece2-b364-4b7e-a585-7aa514beb173-config\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289647 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6zpf\" (UniqueName: \"kubernetes.io/projected/259c146d-1ce3-4a6d-bf50-36315be6efae-kube-api-access-g6zpf\") pod \"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289665 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289684 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc0a8d83-a2d4-4231-a024-85e6cf31955c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289723 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-images\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289750 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvck\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-kube-api-access-4zvck\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289796 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcp2\" (UniqueName: \"kubernetes.io/projected/357dbb1d-53bb-4dc2-9645-377146fed802-kube-api-access-fqcp2\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289815 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-webhook-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289839 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289907 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/54f79e52-3f56-4f2b-a0b1-5a40f029633b-kube-api-access-5cbx6\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289960 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-srv-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.289984 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-stats-auth\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290006 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlxvz\" 
(UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290045 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290061 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe0311f-3e9d-4749-b06c-a28d7d889c45-service-ca-bundle\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndpz\" (UniqueName: \"kubernetes.io/projected/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-kube-api-access-qndpz\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290113 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-socket-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290140 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290156 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-mountpoint-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290221 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdvl\" (UniqueName: \"kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290272 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-registration-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290347 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-apiservice-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290375 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-proxy-tls\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290427 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cedee9dd-d9ad-43b7-97aa-f709c9a59604-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290450 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27950301-9838-48a8-a0d9-83a4083d2e0d-cert\") pod \"ingress-canary-fpmdm\" (UID: \"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290496 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-default-certificate\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290521 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcfm\" (UniqueName: \"kubernetes.io/projected/210eb590-1075-4279-856f-0899b35e0021-kube-api-access-vhcfm\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-srv-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290591 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290619 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grxh9\" (UniqueName: \"kubernetes.io/projected/617be708-08ca-4534-ae2d-2ae747070e51-kube-api-access-grxh9\") pod \"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0003f4f-09f6-47f2-85bb-21cac67d07a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290702 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0e827db1-1343-4055-9fb1-739b183fbf0a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290741 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-metrics-certs\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290757 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwl78\" (UniqueName: \"kubernetes.io/projected/27950301-9838-48a8-a0d9-83a4083d2e0d-kube-api-access-rwl78\") pod \"ingress-canary-fpmdm\" (UID: \"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-certs\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.290788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54f79e52-3f56-4f2b-a0b1-5a40f029633b-metrics-tls\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291245 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291297 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0003f4f-09f6-47f2-85bb-21cac67d07a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e827db1-1343-4055-9fb1-739b183fbf0a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291360 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrj6\" (UniqueName: \"kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291380 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cedee9dd-d9ad-43b7-97aa-f709c9a59604-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291434 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvxhn\" (UniqueName: \"kubernetes.io/projected/a99cb22b-8c01-4a64-b512-8e4f61fb0558-kube-api-access-mvxhn\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291453 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x49v4\" (UniqueName: \"kubernetes.io/projected/2fe0311f-3e9d-4749-b06c-a28d7d889c45-kube-api-access-x49v4\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291468 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8676ece2-b364-4b7e-a585-7aa514beb173-serving-cert\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291483 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ea384-24f2-4134-9f53-3f6764f1a0d2-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291937 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-cabundle\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-plugins-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291987 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgtm\" (UniqueName: \"kubernetes.io/projected/fec1c815-e7e7-4367-b126-66d1ff806bf7-kube-api-access-ctgtm\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292022 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-key\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292039 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/210eb590-1075-4279-856f-0899b35e0021-proxy-tls\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292062 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b79ea384-24f2-4134-9f53-3f6764f1a0d2-metrics-tls\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54f79e52-3f56-4f2b-a0b1-5a40f029633b-config-volume\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292129 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szgsv\" (UniqueName: \"kubernetes.io/projected/3585ee2c-5c20-4a41-8909-1363e4554ccf-kube-api-access-szgsv\") pod \"migrator-59844c95c7-2mx5m\" (UID: \"3585ee2c-5c20-4a41-8909-1363e4554ccf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292145 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/210eb590-1075-4279-856f-0899b35e0021-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292204 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-csi-data-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292268 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/357dbb1d-53bb-4dc2-9645-377146fed802-tmpfs\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292285 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thngc\" (UniqueName: \"kubernetes.io/projected/c0003f4f-09f6-47f2-85bb-21cac67d07a5-kube-api-access-thngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292305 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blbwn\" (UniqueName: \"kubernetes.io/projected/8676ece2-b364-4b7e-a585-7aa514beb173-kube-api-access-blbwn\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292350 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75zjr\" (UniqueName: \"kubernetes.io/projected/bc0a8d83-a2d4-4231-a024-85e6cf31955c-kube-api-access-75zjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292367 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e827db1-1343-4055-9fb1-739b183fbf0a-config\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292383 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292432 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cedee9dd-d9ad-43b7-97aa-f709c9a59604-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292450 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292468 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/617be708-08ca-4534-ae2d-2ae747070e51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292511 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mp5\" (UniqueName: \"kubernetes.io/projected/6ffc6194-dadb-4607-a3c0-08dbdff1d476-kube-api-access-l6mp5\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292529 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbbwx\" (UniqueName: \"kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.292546 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5k2\" (UniqueName: \"kubernetes.io/projected/3b742e38-4844-4cd1-8523-9dd476bf87fa-kube-api-access-pj5k2\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.293456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.294894 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0003f4f-09f6-47f2-85bb-21cac67d07a5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.295433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.296979 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ea384-24f2-4134-9f53-3f6764f1a0d2-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.291243 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk7gz\" (UniqueName: \"kubernetes.io/projected/4db586ef-5bd4-45b9-af5d-825ac88a79e2-kube-api-access-tk7gz\") pod \"etcd-operator-b45778765-xxspw\" (UID: \"4db586ef-5bd4-45b9-af5d-825ac88a79e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.299516 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8676ece2-b364-4b7e-a585-7aa514beb173-serving-cert\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.299806 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-cabundle\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.299891 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-plugins-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.301148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-profile-collector-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.302361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe0311f-3e9d-4749-b06c-a28d7d889c45-service-ca-bundle\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.302436 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nrtwd"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.302522 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-socket-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.303090 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.303142 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-mountpoint-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.303582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-registration-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.303608 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.303952 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-node-bootstrap-token\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.304312 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c06369a-5668-4982-aaba-acc71c4c4ce1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-klgkl\" (UID: \"1c06369a-5668-4982-aaba-acc71c4c4ce1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.304450 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8676ece2-b364-4b7e-a585-7aa514beb173-config\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: 
\"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.304586 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.305806 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.306381 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fec1c815-e7e7-4367-b126-66d1ff806bf7-srv-cert\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.307234 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.308321 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-images\") pod 
\"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.309505 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.309720 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/259c146d-1ce3-4a6d-bf50-36315be6efae-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.310221 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cedee9dd-d9ad-43b7-97aa-f709c9a59604-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.310361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a99cb22b-8c01-4a64-b512-8e4f61fb0558-csi-data-dir\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.310561 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:36.810543947 +0000 UTC m=+116.320262913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.312702 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-webhook-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.315071 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b742e38-4844-4cd1-8523-9dd476bf87fa-certs\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.315621 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-metrics-certs\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.316171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/210eb590-1075-4279-856f-0899b35e0021-proxy-tls\") pod 
\"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.316443 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.318821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-proxy-tls\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.318828 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54f79e52-3f56-4f2b-a0b1-5a40f029633b-config-volume\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.319216 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b79ea384-24f2-4134-9f53-3f6764f1a0d2-metrics-tls\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.319328 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" 
Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.319579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/357dbb1d-53bb-4dc2-9645-377146fed802-tmpfs\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.319674 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0003f4f-09f6-47f2-85bb-21cac67d07a5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.319914 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54f79e52-3f56-4f2b-a0b1-5a40f029633b-metrics-tls\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.320758 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc0a8d83-a2d4-4231-a024-85e6cf31955c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.322036 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cedee9dd-d9ad-43b7-97aa-f709c9a59604-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" 
(UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.322421 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/210eb590-1075-4279-856f-0899b35e0021-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.322847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27950301-9838-48a8-a0d9-83a4083d2e0d-cert\") pod \"ingress-canary-fpmdm\" (UID: \"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.324042 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.324432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/617be708-08ca-4534-ae2d-2ae747070e51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.325624 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/6ffc6194-dadb-4607-a3c0-08dbdff1d476-signing-key\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.328581 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/357dbb1d-53bb-4dc2-9645-377146fed802-apiservice-cert\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.331067 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-default-certificate\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.333026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e827db1-1343-4055-9fb1-739b183fbf0a-config\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.336050 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2fe0311f-3e9d-4749-b06c-a28d7d889c45-stats-auth\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.341830 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-srv-cert\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.347020 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-px7gz"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.347374 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g8hxm"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.348336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e827db1-1343-4055-9fb1-739b183fbf0a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.349545 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ff7fe59-fc6e-4a7f-826f-71f45302c074-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z72zk\" (UID: \"6ff7fe59-fc6e-4a7f-826f-71f45302c074\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.350963 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.378606 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mvxhn\" (UniqueName: \"kubernetes.io/projected/a99cb22b-8c01-4a64-b512-8e4f61fb0558-kube-api-access-mvxhn\") pod \"csi-hostpathplugin-9hvwh\" (UID: \"a99cb22b-8c01-4a64-b512-8e4f61fb0558\") " pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: W0320 13:23:36.382920 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcebbd24_997c_44b3_bc3c_cedb2caf8e89.slice/crio-aea80132358b64e4a9fd525d0a9b5c8796c180385d566a31e549f997ffd7b4ad WatchSource:0}: Error finding container aea80132358b64e4a9fd525d0a9b5c8796c180385d566a31e549f997ffd7b4ad: Status 404 returned error can't find the container with id aea80132358b64e4a9fd525d0a9b5c8796c180385d566a31e549f997ffd7b4ad Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.390202 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhcfm\" (UniqueName: \"kubernetes.io/projected/210eb590-1075-4279-856f-0899b35e0021-kube-api-access-vhcfm\") pod \"machine-config-controller-84d6567774-dnv8x\" (UID: \"210eb590-1075-4279-856f-0899b35e0021\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.393105 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:36.893088877 +0000 UTC m=+116.402807843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.393049 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.391370 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8wls"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.393334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.393855 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:36.893844303 +0000 UTC m=+116.403563269 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.415482 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.415846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p9sq\" (UniqueName: \"kubernetes.io/projected/6dda9b81-daca-4bc0-83e4-bad3b0f20dfc-kube-api-access-5p9sq\") pod \"olm-operator-6b444d44fb-kdh8v\" (UID: \"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.427532 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5k2\" (UniqueName: \"kubernetes.io/projected/3b742e38-4844-4cd1-8523-9dd476bf87fa-kube-api-access-pj5k2\") pod \"machine-config-server-hvtz7\" (UID: \"3b742e38-4844-4cd1-8523-9dd476bf87fa\") " pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: W0320 13:23:36.429027 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a35300_9f9f_44c7_a1ff_d818032e001a.slice/crio-fe5425d1d0ad848e1de834fdf844787c4268cbf06fd63aac0aeb5170cd345cfc WatchSource:0}: Error finding container fe5425d1d0ad848e1de834fdf844787c4268cbf06fd63aac0aeb5170cd345cfc: Status 404 returned error can't find the container with id 
fe5425d1d0ad848e1de834fdf844787c4268cbf06fd63aac0aeb5170cd345cfc Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.448199 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.465173 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49v4\" (UniqueName: \"kubernetes.io/projected/2fe0311f-3e9d-4749-b06c-a28d7d889c45-kube-api-access-x49v4\") pod \"router-default-5444994796-fgv7q\" (UID: \"2fe0311f-3e9d-4749-b06c-a28d7d889c45\") " pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.480195 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e827db1-1343-4055-9fb1-739b183fbf0a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5kx2j\" (UID: \"0e827db1-1343-4055-9fb1-739b183fbf0a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.494560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.495297 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:36.995277723 +0000 UTC m=+116.504996689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.511413 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hvtz7" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.522031 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdvl\" (UniqueName: \"kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl\") pod \"collect-profiles-29566875-cvq2h\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.525070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxh9\" (UniqueName: \"kubernetes.io/projected/617be708-08ca-4534-ae2d-2ae747070e51-kube-api-access-grxh9\") pod \"multus-admission-controller-857f4d67dd-c6qbf\" (UID: \"617be708-08ca-4534-ae2d-2ae747070e51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.528739 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.533020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgtm\" (UniqueName: \"kubernetes.io/projected/fec1c815-e7e7-4367-b126-66d1ff806bf7-kube-api-access-ctgtm\") pod \"catalog-operator-68c6474976-lj4nt\" (UID: \"fec1c815-e7e7-4367-b126-66d1ff806bf7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.551900 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndpz\" (UniqueName: \"kubernetes.io/projected/0a0750f5-6779-49ca-ac7e-2b24526fbd5d-kube-api-access-qndpz\") pod \"machine-config-operator-74547568cd-hjc74\" (UID: \"0a0750f5-6779-49ca-ac7e-2b24526fbd5d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.576348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwl78\" (UniqueName: \"kubernetes.io/projected/27950301-9838-48a8-a0d9-83a4083d2e0d-kube-api-access-rwl78\") pod \"ingress-canary-fpmdm\" (UID: \"27950301-9838-48a8-a0d9-83a4083d2e0d\") " pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.587982 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.595137 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6zpf\" (UniqueName: \"kubernetes.io/projected/259c146d-1ce3-4a6d-bf50-36315be6efae-kube-api-access-g6zpf\") pod \"package-server-manager-789f6589d5-9hxxm\" (UID: \"259c146d-1ce3-4a6d-bf50-36315be6efae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.595969 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.596282 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.096269042 +0000 UTC m=+116.605988008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.624809 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cedee9dd-d9ad-43b7-97aa-f709c9a59604-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zc2zz\" (UID: \"cedee9dd-d9ad-43b7-97aa-f709c9a59604\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.634706 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.650230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thngc\" (UniqueName: \"kubernetes.io/projected/c0003f4f-09f6-47f2-85bb-21cac67d07a5-kube-api-access-thngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-nzwd4\" (UID: \"c0003f4f-09f6-47f2-85bb-21cac67d07a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.651294 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.652846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blbwn\" (UniqueName: \"kubernetes.io/projected/8676ece2-b364-4b7e-a585-7aa514beb173-kube-api-access-blbwn\") pod \"service-ca-operator-777779d784-7fjj4\" (UID: \"8676ece2-b364-4b7e-a585-7aa514beb173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.658783 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.673244 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.679850 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75zjr\" (UniqueName: \"kubernetes.io/projected/bc0a8d83-a2d4-4231-a024-85e6cf31955c-kube-api-access-75zjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-2dq2z\" (UID: \"bc0a8d83-a2d4-4231-a024-85e6cf31955c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.706768 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.707030 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.707349 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.20732807 +0000 UTC m=+116.717047036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.707445 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.707856 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.207849372 +0000 UTC m=+116.717568338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.708155 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.718522 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbx6\" (UniqueName: \"kubernetes.io/projected/54f79e52-3f56-4f2b-a0b1-5a40f029633b-kube-api-access-5cbx6\") pod \"dns-default-7dpsb\" (UID: \"54f79e52-3f56-4f2b-a0b1-5a40f029633b\") " pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.729367 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.744020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.744076 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvck\" (UniqueName: \"kubernetes.io/projected/b79ea384-24f2-4134-9f53-3f6764f1a0d2-kube-api-access-4zvck\") pod \"ingress-operator-5b745b69d9-frk59\" (UID: \"b79ea384-24f2-4134-9f53-3f6764f1a0d2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.745082 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.775247 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcp2\" (UniqueName: \"kubernetes.io/projected/357dbb1d-53bb-4dc2-9645-377146fed802-kube-api-access-fqcp2\") pod \"packageserver-d55dfcdfc-zts4g\" (UID: \"357dbb1d-53bb-4dc2-9645-377146fed802\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.788997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mp5\" (UniqueName: \"kubernetes.io/projected/6ffc6194-dadb-4607-a3c0-08dbdff1d476-kube-api-access-l6mp5\") pod \"service-ca-9c57cc56f-9wvm6\" (UID: \"6ffc6194-dadb-4607-a3c0-08dbdff1d476\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.799919 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.805681 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.806516 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xxspw"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.806772 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.808810 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.809137 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.309123737 +0000 UTC m=+116.818842703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.822358 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgsv\" (UniqueName: \"kubernetes.io/projected/3585ee2c-5c20-4a41-8909-1363e4554ccf-kube-api-access-szgsv\") pod \"migrator-59844c95c7-2mx5m\" (UID: \"3585ee2c-5c20-4a41-8909-1363e4554ccf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.848096 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fpmdm" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.850753 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.854013 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbbwx\" (UniqueName: \"kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx\") pod \"marketplace-operator-79b997595-hlxvz\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:36 crc kubenswrapper[4895]: W0320 13:23:36.864497 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b742e38_4844_4cd1_8523_9dd476bf87fa.slice/crio-8c1d0b1b560a4b6304f6118cf57217e25a179a2c854b1266809f37660e188079 WatchSource:0}: Error finding container 8c1d0b1b560a4b6304f6118cf57217e25a179a2c854b1266809f37660e188079: Status 404 returned error can't find the container with id 8c1d0b1b560a4b6304f6118cf57217e25a179a2c854b1266809f37660e188079 Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.874876 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrj6\" (UniqueName: \"kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6\") pod \"cni-sysctl-allowlist-ds-gm5sr\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.911807 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: 
\"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:36 crc kubenswrapper[4895]: E0320 13:23:36.912118 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.41210699 +0000 UTC m=+116.921825946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.918410 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.951101 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.977930 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" event={"ID":"64a35300-9f9f-44c7-a1ff-d818032e001a","Type":"ContainerStarted","Data":"fe5425d1d0ad848e1de834fdf844787c4268cbf06fd63aac0aeb5170cd345cfc"} Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.978033 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.982549 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.989907 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v"] Mar 20 13:23:36 crc kubenswrapper[4895]: I0320 13:23:36.994730 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.015131 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.015821 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.515803919 +0000 UTC m=+117.025522885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.017069 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.054489 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.075604 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" event={"ID":"a2980104-0602-4c36-8c7b-3877a591bc14","Type":"ContainerStarted","Data":"994d7f81ed253d415cf30337a773dbefff6dc09d5390c14e2897142574434d6b"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.078192 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wrj6w" event={"ID":"565b4975-d16b-4ce5-8200-a0700d9e9d4c","Type":"ContainerStarted","Data":"a9b4c69b971850e86543eb0a945f84911d166aa80d1c6a477d900246135a912b"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.098137 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" event={"ID":"4d6abede-a152-40a5-a419-838da91e1e7e","Type":"ContainerStarted","Data":"a3b03efaebdac942aace40e54daab83b3f30f06891f70f4e7af7534f94276f60"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.113272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" event={"ID":"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17","Type":"ContainerStarted","Data":"c3a4b5eb26333da75ed760687d3f1c3a26eb54f6ce9cb7c8b6ef101a8876046f"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.113310 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" 
event={"ID":"a55b1e0b-6071-4fcf-8ca4-f9931fab9b17","Type":"ContainerStarted","Data":"4d17adadd54264259906a802815b4c8b61f3f19a02fed1791b28f9f43343ae40"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.119856 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.120206 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.620193681 +0000 UTC m=+117.129912647 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.138479 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.191240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" event={"ID":"8a8a5873-21a7-4e54-8492-a8b86a088023","Type":"ContainerStarted","Data":"ecf36dfdb6bce2d0e78919c92a133bac011875bb0b976dfe6a14942fe5887a45"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.191296 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" event={"ID":"8a8a5873-21a7-4e54-8492-a8b86a088023","Type":"ContainerStarted","Data":"c25fe86cd026334fbe85a06f12dcab8f34917945bd35f970e4e12d935bb9a07d"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.204562 4895 generic.go:334] "Generic (PLEG): container finished" podID="a3ba85f8-49ce-44f5-8a95-914fbcca3a8e" containerID="e6856d3ac6af5deee5a94dd2575a75c99bb274af42473e9e68d35f59b3bb6e92" exitCode=0 Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.204678 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" event={"ID":"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e","Type":"ContainerDied","Data":"e6856d3ac6af5deee5a94dd2575a75c99bb274af42473e9e68d35f59b3bb6e92"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.204713 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" event={"ID":"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e","Type":"ContainerStarted","Data":"7c16d94ab1cba4b0eeb3c449fba7ca5ba393fe41dd98d878d0be6988611e1444"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.207731 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hvtz7" 
event={"ID":"3b742e38-4844-4cd1-8523-9dd476bf87fa","Type":"ContainerStarted","Data":"8c1d0b1b560a4b6304f6118cf57217e25a179a2c854b1266809f37660e188079"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.220068 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.220157 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.221119 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.222266 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.722246404 +0000 UTC m=+117.231965360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.248147 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.248218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcm6r" event={"ID":"ed200aaa-4ed3-4e46-a934-9e97a94e0738","Type":"ContainerStarted","Data":"ef15dd471417bba1ececcec7f75a886f2d584f5e70e0f595148d3a9f45adbc1d"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.248264 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9hvwh"] Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.255664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" event={"ID":"22aa23b6-96e1-49b3-bbb9-d414e27df43b","Type":"ContainerStarted","Data":"b2b115de9081bf45e6604db72f1636d9c56cbbcb7224196e04aed6ac9b02e459"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.267147 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" event={"ID":"4005b37f-581a-4651-9dcb-f16414503616","Type":"ContainerStarted","Data":"5ee3cf1c83d6ef90645c4c8c60aa220a7ddc5e0a5685b6bbe68a7a7634681b9f"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.267194 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" 
event={"ID":"4005b37f-581a-4651-9dcb-f16414503616","Type":"ContainerStarted","Data":"eb2a62d87010c435e66f66469df5f08d503e34ac38134c7c9c0bec932df6a5fc"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.281531 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk"] Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.286041 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" event={"ID":"4db586ef-5bd4-45b9-af5d-825ac88a79e2","Type":"ContainerStarted","Data":"4f9ff15f4fe028566c5a9efcd0a273883d7162affbd1214f68c2ba60c434edff"} Mar 20 13:23:37 crc kubenswrapper[4895]: W0320 13:23:37.310583 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ff7fe59_fc6e_4a7f_826f_71f45302c074.slice/crio-41534623930c110b422d10053764c169c25021c0a50255056e55094a8cea3f52 WatchSource:0}: Error finding container 41534623930c110b422d10053764c169c25021c0a50255056e55094a8cea3f52: Status 404 returned error can't find the container with id 41534623930c110b422d10053764c169c25021c0a50255056e55094a8cea3f52 Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.310976 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" event={"ID":"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca","Type":"ContainerStarted","Data":"2c7f848b0446aa7fe0056eedd1b03da9b1080310f3498230fa88f7512013fb79"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.311023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" event={"ID":"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca","Type":"ContainerStarted","Data":"41de6371904d73b72e4282e53d197da84817cec80a4a9f14e99daf47446d0e5f"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.311033 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" event={"ID":"2a86aab5-8a4e-44e0-a4be-e5f05857e3ca","Type":"ContainerStarted","Data":"695059c4d76785ceac0d9c03f75eb59bfb05b4913197e712d94639bf566b055f"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.317710 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" event={"ID":"bcebbd24-997c-44b3-bc3c-cedb2caf8e89","Type":"ContainerStarted","Data":"aea80132358b64e4a9fd525d0a9b5c8796c180385d566a31e549f997ffd7b4ad"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.321543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" event={"ID":"c1b23c96-a749-4ee0-b8d5-8619916be03a","Type":"ContainerStarted","Data":"abf16c98f96012a5675ea65d4aa02d3dc9d6de6d06a37d38235d89f4c7e4b2a4"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.321616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" event={"ID":"c1b23c96-a749-4ee0-b8d5-8619916be03a","Type":"ContainerStarted","Data":"a82e47886517799343d8ede37309cb75303c83cfd74c14bc31c49753c5abbe1c"} Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.322564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.322719 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 
13:23:37.325155 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.825139046 +0000 UTC m=+117.334858012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.337333 4895 patch_prober.go:28] interesting pod/console-operator-58897d9998-vbwjf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.337425 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" podUID="c1b23c96-a749-4ee0-b8d5-8619916be03a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.340109 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.423429 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.426358 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.92633835 +0000 UTC m=+117.436057316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.426558 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.430082 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:37.929969438 +0000 UTC m=+117.439688404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.435184 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" podStartSLOduration=67.43516246 podStartE2EDuration="1m7.43516246s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:37.430695704 +0000 UTC m=+116.940414680" watchObservedRunningTime="2026-03-20 13:23:37.43516246 +0000 UTC m=+116.944881426" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.518765 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-psd4t" podStartSLOduration=67.518748433 podStartE2EDuration="1m7.518748433s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:37.498623746 +0000 UTC m=+117.008342712" watchObservedRunningTime="2026-03-20 13:23:37.518748433 +0000 UTC m=+117.028467389" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.529190 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.529518 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.029502645 +0000 UTC m=+117.539221611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.604871 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9czxh" podStartSLOduration=68.604855389 podStartE2EDuration="1m8.604855389s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:37.556715576 +0000 UTC m=+117.066434542" watchObservedRunningTime="2026-03-20 13:23:37.604855389 +0000 UTC m=+117.114574355" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.605805 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j"] Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.630977 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.631368 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.131352004 +0000 UTC m=+117.641070970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.732606 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.73259052 podStartE2EDuration="16.73259052s" podCreationTimestamp="2026-03-20 13:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:37.732516367 +0000 UTC m=+117.242235343" watchObservedRunningTime="2026-03-20 13:23:37.73259052 +0000 UTC m=+117.242309486" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.733662 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.734430 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.234414789 +0000 UTC m=+117.744133755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.773928 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz"] Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.838988 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" podStartSLOduration=67.838971105 podStartE2EDuration="1m7.838971105s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:37.835773436 +0000 UTC m=+117.345492402" watchObservedRunningTime="2026-03-20 13:23:37.838971105 +0000 UTC m=+117.348690071" Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.843070 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.843722 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.343707489 +0000 UTC m=+117.853426455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.853198 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm"] Mar 20 13:23:37 crc kubenswrapper[4895]: I0320 13:23:37.944145 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:37 crc kubenswrapper[4895]: E0320 13:23:37.944565 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.444549295 +0000 UTC m=+117.954268261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.039223 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mcm6r" podStartSLOduration=68.039203987 podStartE2EDuration="1m8.039203987s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.037701014 +0000 UTC m=+117.547419980" watchObservedRunningTime="2026-03-20 13:23:38.039203987 +0000 UTC m=+117.548922953" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.046084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.046479 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.546465915 +0000 UTC m=+118.056184881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.130994 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" podStartSLOduration=68.130979227 podStartE2EDuration="1m8.130979227s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.130300513 +0000 UTC m=+117.640019489" watchObservedRunningTime="2026-03-20 13:23:38.130979227 +0000 UTC m=+117.640698193" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.146711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.149716 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.649684922 +0000 UTC m=+118.159403888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.174584 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gtgnx" podStartSLOduration=69.174571252 podStartE2EDuration="1m9.174571252s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.173515989 +0000 UTC m=+117.683234955" watchObservedRunningTime="2026-03-20 13:23:38.174571252 +0000 UTC m=+117.684290208" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.248122 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.248185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.248716 4895 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.748702749 +0000 UTC m=+118.258421715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.268186 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e899877b-fe80-4ace-9b35-41eb7302cf12-metrics-certs\") pod \"network-metrics-daemon-t9xh5\" (UID: \"e899877b-fe80-4ace-9b35-41eb7302cf12\") " pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.290733 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.304164 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q9wq4" podStartSLOduration=69.304144752 podStartE2EDuration="1m9.304144752s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.302354142 +0000 UTC m=+117.812073108" watchObservedRunningTime="2026-03-20 13:23:38.304144752 +0000 UTC m=+117.813863728" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.369780 
4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.370270 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4pshc" podStartSLOduration=69.370255915 podStartE2EDuration="1m9.370255915s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.363442687 +0000 UTC m=+117.873161653" watchObservedRunningTime="2026-03-20 13:23:38.370255915 +0000 UTC m=+117.879974881" Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.370519 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:38.87050716 +0000 UTC m=+118.380226126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.395524 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fgv7q" event={"ID":"2fe0311f-3e9d-4749-b06c-a28d7d889c45","Type":"ContainerStarted","Data":"095eda94498deab3f9b683ef878a76fb3bb67c6da7a85e7be0cf413c96c4858a"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.498996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" event={"ID":"6ff7fe59-fc6e-4a7f-826f-71f45302c074","Type":"ContainerStarted","Data":"41534623930c110b422d10053764c169c25021c0a50255056e55094a8cea3f52"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.502159 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.503509 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.003486323 +0000 UTC m=+118.513205289 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.503752 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vbwjf" podStartSLOduration=68.503734409 podStartE2EDuration="1m8.503734409s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:38.47378049 +0000 UTC m=+117.983499456" watchObservedRunningTime="2026-03-20 13:23:38.503734409 +0000 UTC m=+118.013453375" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.526377 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t9xh5" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.544436 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.548332 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fpmdm"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.569154 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.585952 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7dpsb"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.602608 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.602890 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.102874698 +0000 UTC m=+118.612593664 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.608293 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.626986 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" event={"ID":"bcebbd24-997c-44b3-bc3c-cedb2caf8e89","Type":"ContainerStarted","Data":"f81e4fb0c673d4447934ccee1e56efb841c140f9bdd65d0aab4131be457e05c1"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.647674 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" event={"ID":"a99cb22b-8c01-4a64-b512-8e4f61fb0558","Type":"ContainerStarted","Data":"492f0259199b521ec89855e01326a001104c7e93bddfefa5a11f481a919be768"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.658959 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" event={"ID":"1c06369a-5668-4982-aaba-acc71c4c4ce1","Type":"ContainerStarted","Data":"56ebc4d8d30f1882ae9bd9e8de0c7bb958f1c8129400a16728c31ba00493baa1"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.704246 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: 
\"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.704679 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.204666626 +0000 UTC m=+118.714385592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.711626 4895 generic.go:334] "Generic (PLEG): container finished" podID="22aa23b6-96e1-49b3-bbb9-d414e27df43b" containerID="6ef196321509fc6a7752bfbb2e110aed4b48c7da481c747a6a6ed1e474c92ecd" exitCode=0 Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.711719 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" event={"ID":"22aa23b6-96e1-49b3-bbb9-d414e27df43b","Type":"ContainerDied","Data":"6ef196321509fc6a7752bfbb2e110aed4b48c7da481c747a6a6ed1e474c92ecd"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.734467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" event={"ID":"0e827db1-1343-4055-9fb1-739b183fbf0a","Type":"ContainerStarted","Data":"1190942722e15bd091f435c9460e15d02c2e98d5f2f8469f3a8916bae5e2e773"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.767233 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.777112 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60154: no serving certificate available for the kubelet" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.804958 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:38 crc kubenswrapper[4895]: E0320 13:23:38.807604 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.307586646 +0000 UTC m=+118.817305612 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.849987 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" event={"ID":"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc","Type":"ContainerStarted","Data":"1ae2e76643f0648f97d77a9c7716cebb973a34c7799e39c83e5ae79713f27abe"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.855922 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" event={"ID":"cedee9dd-d9ad-43b7-97aa-f709c9a59604","Type":"ContainerStarted","Data":"f5339bd206d341ffa83c5c37db0dfde4bc51055460093d8d197ee4347cd17eb4"} Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.880208 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60156: no serving certificate available for the kubelet" Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.881801 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.911183 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:38 crc 
kubenswrapper[4895]: E0320 13:23:38.911856 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.411844568 +0000 UTC m=+118.921563534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.946751 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74"] Mar 20 13:23:38 crc kubenswrapper[4895]: I0320 13:23:38.978787 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60172: no serving certificate available for the kubelet" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.004547 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-frk59"] Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.011929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" event={"ID":"11c4b94b-775c-473d-9c77-6597504fb4c8","Type":"ContainerStarted","Data":"c28376bf917ac2521c98b3a5ccb5aea71c4169582ff8c7e87ff64ea03fecd5b3"} Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.014249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.014959 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.514931912 +0000 UTC m=+119.024650878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:39 crc kubenswrapper[4895]: W0320 13:23:39.028834 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94398e3b_a910_4cd4_bb8a_2e599d39e8e4.slice/crio-a52c69da1cc75b903adb00b8413f03f317674af75f9abf988a9d263e4b8f8c7a WatchSource:0}: Error finding container a52c69da1cc75b903adb00b8413f03f317674af75f9abf988a9d263e4b8f8c7a: Status 404 returned error can't find the container with id a52c69da1cc75b903adb00b8413f03f317674af75f9abf988a9d263e4b8f8c7a Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.033149 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c6qbf"] Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.065476 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z"] Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.073023 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g"] Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.090119 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60184: no serving certificate available for the kubelet" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.090555 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wrj6w" event={"ID":"565b4975-d16b-4ce5-8200-a0700d9e9d4c","Type":"ContainerStarted","Data":"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64"} Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.115690 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.117293 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.616025235 +0000 UTC m=+119.125744201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.137666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" event={"ID":"64a35300-9f9f-44c7-a1ff-d818032e001a","Type":"ContainerStarted","Data":"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7"} Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.140218 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.173584 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wrj6w" podStartSLOduration=69.173548412 podStartE2EDuration="1m9.173548412s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:39.171239381 +0000 UTC m=+118.680958347" watchObservedRunningTime="2026-03-20 13:23:39.173548412 +0000 UTC m=+118.683267378" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.221350 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:39 crc kubenswrapper[4895]: 
E0320 13:23:39.221541 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.721521012 +0000 UTC m=+119.231239978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.221593 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" event={"ID":"259c146d-1ce3-4a6d-bf50-36315be6efae","Type":"ContainerStarted","Data":"57f97c9746724c4c422db4342bda3177fffb26acca0b93288d0629e701ebfe91"} Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.236717 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.236773 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.179500 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60200: no serving 
certificate available for the kubelet" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.276052 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60214: no serving certificate available for the kubelet" Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.281701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.311508 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.811468612 +0000 UTC m=+119.321187578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.381722 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" podStartSLOduration=70.381699054 podStartE2EDuration="1m10.381699054s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:39.364933291 +0000 UTC m=+118.874652287" watchObservedRunningTime="2026-03-20 13:23:39.381699054 +0000 UTC m=+118.891418020"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.382915 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.386274 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60226: no serving certificate available for the kubelet"
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.390000 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.889968893 +0000 UTC m=+119.399687859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.401708 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.403212 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:39.903195891 +0000 UTC m=+119.412914857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.411677 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.411726 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.411823 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.411850 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vbwjf"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.513458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.513940 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.013912131 +0000 UTC m=+119.523631097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.552205 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60234: no serving certificate available for the kubelet"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.566526 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"]
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.586179 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"]
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.586383 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" podUID="33d87434-561d-4397-94b6-1a96d6286361" containerName="route-controller-manager" containerID="cri-o://835f7b9ecf53abfe53a978d19f4ade769967324058c227d6c8bdc69877d9076d" gracePeriod=30
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.611179 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wvm6"]
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.631337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.631700 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.131688715 +0000 UTC m=+119.641407681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.724121 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m"]
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.733016 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.733164 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.233141734 +0000 UTC m=+119.742860700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.733518 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.733901 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.23388785 +0000 UTC m=+119.743606816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.821606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls"
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.835020 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.835405 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.33537014 +0000 UTC m=+119.845089106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:39 crc kubenswrapper[4895]: W0320 13:23:39.835475 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3585ee2c_5c20_4a41_8909_1363e4554ccf.slice/crio-01328428019d6b922b8ed7cabf0dc7fbc509d5a09693d327149fae46210d4713 WatchSource:0}: Error finding container 01328428019d6b922b8ed7cabf0dc7fbc509d5a09693d327149fae46210d4713: Status 404 returned error can't find the container with id 01328428019d6b922b8ed7cabf0dc7fbc509d5a09693d327149fae46210d4713
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.893624 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t9xh5"]
Mar 20 13:23:39 crc kubenswrapper[4895]: I0320 13:23:39.937467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:39 crc kubenswrapper[4895]: E0320 13:23:39.937854 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.437840103 +0000 UTC m=+119.947559069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.040821 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.040943 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.540923227 +0000 UTC m=+120.050642193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.041063 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.041127 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.041276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.043148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.044668 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.543475043 +0000 UTC m=+120.053194009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.072844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.145311 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.145540 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.145571 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.150031 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.650000542 +0000 UTC m=+120.159719508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.158037 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.179196 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.235492 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.237969 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.245989 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.246514 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.246843 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.746832102 +0000 UTC m=+120.256551068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.261698 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.286806 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60240: no serving certificate available for the kubelet"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.308428 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" event={"ID":"fec1c815-e7e7-4367-b126-66d1ff806bf7","Type":"ContainerStarted","Data":"0a30ad18eba9d0b6bc5f31218c9dfdb5ca50d498d3ccc9afd827744ab58b6e60"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.308476 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" event={"ID":"fec1c815-e7e7-4367-b126-66d1ff806bf7","Type":"ContainerStarted","Data":"eaada42c84e65f448875b61620b84b890579c5d88f315e665155ab0965e0bf83"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.313821 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.318858 4895 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lj4nt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.318925 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" podUID="fec1c815-e7e7-4367-b126-66d1ff806bf7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.322859 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.344139 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" podStartSLOduration=70.344125132 podStartE2EDuration="1m10.344125132s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.342576588 +0000 UTC m=+119.852295554" watchObservedRunningTime="2026-03-20 13:23:40.344125132 +0000 UTC m=+119.853844098"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.350116 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.350518 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.850503589 +0000 UTC m=+120.360222545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.414934 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" event={"ID":"22aa23b6-96e1-49b3-bbb9-d414e27df43b","Type":"ContainerStarted","Data":"8c12ba9b935cd7bf473aa8f50bf469a874587e2b1767a0616add7e5aa416e1df"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.422190 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.422174613 podStartE2EDuration="422.174613ms" podCreationTimestamp="2026-03-20 13:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.372636389 +0000 UTC m=+119.882355355" watchObservedRunningTime="2026-03-20 13:23:40.422174613 +0000 UTC m=+119.931893579"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.437822 4895 generic.go:334] "Generic (PLEG): container finished" podID="33d87434-561d-4397-94b6-1a96d6286361" containerID="835f7b9ecf53abfe53a978d19f4ade769967324058c227d6c8bdc69877d9076d" exitCode=0
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.437895 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg" event={"ID":"33d87434-561d-4397-94b6-1a96d6286361","Type":"ContainerDied","Data":"835f7b9ecf53abfe53a978d19f4ade769967324058c227d6c8bdc69877d9076d"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.437930 4895 scope.go:117] "RemoveContainer" containerID="835f7b9ecf53abfe53a978d19f4ade769967324058c227d6c8bdc69877d9076d"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.438039 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.463483 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca\") pod \"33d87434-561d-4397-94b6-1a96d6286361\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464104 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert\") pod \"33d87434-561d-4397-94b6-1a96d6286361\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464161 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrw6w\" (UniqueName: \"kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w\") pod \"33d87434-561d-4397-94b6-1a96d6286361\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config\") pod \"33d87434-561d-4397-94b6-1a96d6286361\" (UID: \"33d87434-561d-4397-94b6-1a96d6286361\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464476 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca" (OuterVolumeSpecName: "client-ca") pod "33d87434-561d-4397-94b6-1a96d6286361" (UID: "33d87434-561d-4397-94b6-1a96d6286361"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464494 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.464642 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.465474 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config" (OuterVolumeSpecName: "config") pod "33d87434-561d-4397-94b6-1a96d6286361" (UID: "33d87434-561d-4397-94b6-1a96d6286361"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.465958 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:40.965944783 +0000 UTC m=+120.475663749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.498282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hvtz7" event={"ID":"3b742e38-4844-4cd1-8523-9dd476bf87fa","Type":"ContainerStarted","Data":"2c4b27af6d3808ae8b95cb1237cae608d2a83887d34f3a48fb3899e72c0d57ce"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.502008 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33d87434-561d-4397-94b6-1a96d6286361" (UID: "33d87434-561d-4397-94b6-1a96d6286361"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.505481 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w" (OuterVolumeSpecName: "kube-api-access-xrw6w") pod "33d87434-561d-4397-94b6-1a96d6286361" (UID: "33d87434-561d-4397-94b6-1a96d6286361"). InnerVolumeSpecName "kube-api-access-xrw6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.537670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" event={"ID":"bcebbd24-997c-44b3-bc3c-cedb2caf8e89","Type":"ContainerStarted","Data":"cf82a4f1504e8c861c478c3e483a11d72112b7ef5cf7b9f3e1f0efbec2720fbd"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.544229 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hvtz7" podStartSLOduration=7.5442130800000005 podStartE2EDuration="7.54421308s" podCreationTimestamp="2026-03-20 13:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.542178085 +0000 UTC m=+120.051897061" watchObservedRunningTime="2026-03-20 13:23:40.54421308 +0000 UTC m=+120.053932036"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.565473 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" event={"ID":"3585ee2c-5c20-4a41-8909-1363e4554ccf","Type":"ContainerStarted","Data":"01328428019d6b922b8ed7cabf0dc7fbc509d5a09693d327149fae46210d4713"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.566531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.566769 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrw6w\" (UniqueName: \"kubernetes.io/projected/33d87434-561d-4397-94b6-1a96d6286361-kube-api-access-xrw6w\") on node \"crc\" DevicePath \"\""
Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.566794 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.066776279 +0000 UTC m=+120.576495245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.566818 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d87434-561d-4397-94b6-1a96d6286361-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.566830 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d87434-561d-4397-94b6-1a96d6286361-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.569103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" event={"ID":"444aebdc-d867-44b7-9884-e0d89fea57d8","Type":"ContainerStarted","Data":"c51ce63f69819dd27c1a17f9fc4873f38376b68cef2652af8006221a827ab441"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.569125 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" event={"ID":"444aebdc-d867-44b7-9884-e0d89fea57d8","Type":"ContainerStarted","Data":"6b0cb6b651111fa9c6fee7f40a295c34ff9686e1faa9e022a44884069e888a3d"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.588460 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nrtwd" podStartSLOduration=70.588438068 podStartE2EDuration="1m10.588438068s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.5875818 +0000 UTC m=+120.097300766" watchObservedRunningTime="2026-03-20 13:23:40.588438068 +0000 UTC m=+120.098157034"
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.592356 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" event={"ID":"210eb590-1075-4279-856f-0899b35e0021","Type":"ContainerStarted","Data":"5e7ed74e6a6baaae1c756e66fd98249e9eaefd98705696c0e1bdd703365c3ed9"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.596937 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" event={"ID":"210eb590-1075-4279-856f-0899b35e0021","Type":"ContainerStarted","Data":"53e1b7753f510b0d960e7103600653adbfec4c257e06a354445bb7a722f8a104"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.602587 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fgv7q" event={"ID":"2fe0311f-3e9d-4749-b06c-a28d7d889c45","Type":"ContainerStarted","Data":"818bd9e344256eca4f4bd1b85e5fe33d9d047ad45e8809a8d88b8e34bd02ab41"}
Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.622624 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr"
event={"ID":"11c4b94b-775c-473d-9c77-6597504fb4c8","Type":"ContainerStarted","Data":"5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.623025 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.631852 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" event={"ID":"c0003f4f-09f6-47f2-85bb-21cac67d07a5","Type":"ContainerStarted","Data":"e7c33903e5d143bf1d4648361b6edd463ce6a4be9036faafd6d251e6c2debf86"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.631901 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" event={"ID":"c0003f4f-09f6-47f2-85bb-21cac67d07a5","Type":"ContainerStarted","Data":"b1808e189305bf73d5af077578858a0a3fa7b8c97913f2e31d90394e24906cac"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.637016 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.637285 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" event={"ID":"4db586ef-5bd4-45b9-af5d-825ac88a79e2","Type":"ContainerStarted","Data":"0fab36506a3ef92d6692d02e3409af66add88b9c995d874efa98c165559f0b1d"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.641250 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" podStartSLOduration=71.641231123 podStartE2EDuration="1m11.641231123s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.640008576 +0000 UTC m=+120.149727542" watchObservedRunningTime="2026-03-20 13:23:40.641231123 +0000 UTC m=+120.150950089" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.647172 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:40 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:40 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:40 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.647223 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.668345 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.671068 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.171053779 +0000 UTC m=+120.680772745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.678370 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podStartSLOduration=7.678354818 podStartE2EDuration="7.678354818s" podCreationTimestamp="2026-03-20 13:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.667261527 +0000 UTC m=+120.176980493" watchObservedRunningTime="2026-03-20 13:23:40.678354818 +0000 UTC m=+120.188073784" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.678584 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.678766 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d87434-561d-4397-94b6-1a96d6286361" containerName="route-controller-manager" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.678797 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d87434-561d-4397-94b6-1a96d6286361" containerName="route-controller-manager" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.678902 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d87434-561d-4397-94b6-1a96d6286361" containerName="route-controller-manager" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.679248 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.699855 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.727790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7dpsb" event={"ID":"54f79e52-3f56-4f2b-a0b1-5a40f029633b","Type":"ContainerStarted","Data":"2a533bc5d87757db6b1471d8f9f3384cf4b55c1f6c85436133fcfb64395ef990"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.727834 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7dpsb" event={"ID":"54f79e52-3f56-4f2b-a0b1-5a40f029633b","Type":"ContainerStarted","Data":"4da8a7fd2f46d89ee7efc83dcf6b351f0cbb4f07d528cdd9cab7a7f05d82fbf3"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.742741 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fgv7q" podStartSLOduration=70.742724263 podStartE2EDuration="1m10.742724263s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.741706692 +0000 UTC m=+120.251425658" watchObservedRunningTime="2026-03-20 13:23:40.742724263 +0000 UTC m=+120.252443229" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.774175 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.774450 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.774470 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.774513 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.774533 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtzt\" (UniqueName: \"kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.775281 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 13:23:41.275263819 +0000 UTC m=+120.784982775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.781608 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" event={"ID":"a3ba85f8-49ce-44f5-8a95-914fbcca3a8e","Type":"ContainerStarted","Data":"e43e180c16c0335aacaffc88b0535a33b70e99d6f37793341a8b3f140c96bcab"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.783311 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.804101 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nzwd4" podStartSLOduration=70.804083434 podStartE2EDuration="1m10.804083434s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.766869467 +0000 UTC m=+120.276588433" watchObservedRunningTime="2026-03-20 13:23:40.804083434 +0000 UTC m=+120.313802400" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.846370 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 
13:23:40.850437 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" podStartSLOduration=70.850420918 podStartE2EDuration="1m10.850420918s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.848435966 +0000 UTC m=+120.358154952" watchObservedRunningTime="2026-03-20 13:23:40.850420918 +0000 UTC m=+120.360139884" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.851282 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" podStartSLOduration=70.851274337 podStartE2EDuration="1m10.851274337s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.8066942 +0000 UTC m=+120.316413176" watchObservedRunningTime="2026-03-20 13:23:40.851274337 +0000 UTC m=+120.360993313" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.860525 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerStarted","Data":"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.860685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerStarted","Data":"a52c69da1cc75b903adb00b8413f03f317674af75f9abf988a9d263e4b8f8c7a"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.860908 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.863373 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" event={"ID":"617be708-08ca-4534-ae2d-2ae747070e51","Type":"ContainerStarted","Data":"c830ef45a614f17b9729af79276e7ac61524205f649fc346fcd074347befd253"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.878587 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.878650 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.878669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.878707 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert\") pod 
\"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.878727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtzt\" (UniqueName: \"kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.879840 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.379829396 +0000 UTC m=+120.889548362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.880874 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.881159 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" event={"ID":"6ff7fe59-fc6e-4a7f-826f-71f45302c074","Type":"ContainerStarted","Data":"7d6903665c005f870ea4ad8ab7e8dbe768832b592eea552455b3506ec245ac1a"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.888138 4895 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hlxvz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.888316 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.888620 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.900266 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" event={"ID":"0a0750f5-6779-49ca-ac7e-2b24526fbd5d","Type":"ContainerStarted","Data":"fa40f5a02db570120f1176e6f7573bba0b345f21ef822a7c5edbd1c6905eb281"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.902088 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" 
event={"ID":"6dda9b81-daca-4bc0-83e4-bad3b0f20dfc","Type":"ContainerStarted","Data":"6bc303389a237162e06097b90efc8f1c7615c6352fcbf48dbf6d046018c11941"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.902146 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.903182 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" event={"ID":"6ffc6194-dadb-4607-a3c0-08dbdff1d476","Type":"ContainerStarted","Data":"6f2604028a9968d7cb1599ebb07036cb22362a14c8e3556b492c9ec72414ce80"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.914892 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xxspw" podStartSLOduration=70.914873426 podStartE2EDuration="1m10.914873426s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.913947296 +0000 UTC m=+120.423666262" watchObservedRunningTime="2026-03-20 13:23:40.914873426 +0000 UTC m=+120.424592392" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.925205 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.958361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.961481 4895 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" event={"ID":"8676ece2-b364-4b7e-a585-7aa514beb173","Type":"ContainerStarted","Data":"f84c94cf465bbdaffe0e02f3a5ebc36002448b5be9add959f3a51b5b758bd8c6"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.961895 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" event={"ID":"8676ece2-b364-4b7e-a585-7aa514beb173","Type":"ContainerStarted","Data":"aa8cf140e6de96336281f85a77aa8e8edd9822d529c88b47107f611b19c4b007"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.973025 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t9xh5" event={"ID":"e899877b-fe80-4ace-9b35-41eb7302cf12","Type":"ContainerStarted","Data":"5df6355499b0d696dfdcb11052a9514b721f63d46d3328498fed5068a45c00d6"} Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.981505 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtzt\" (UniqueName: \"kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt\") pod \"route-controller-manager-5b59ddbb88-29t8x\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.982096 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:40 crc kubenswrapper[4895]: E0320 13:23:40.983470 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.483454433 +0000 UTC m=+120.993173399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:40 crc kubenswrapper[4895]: I0320 13:23:40.991836 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" event={"ID":"bc0a8d83-a2d4-4231-a024-85e6cf31955c","Type":"ContainerStarted","Data":"c9151fb90edc137328c78092b119e7f9398208e93a5947346b044a03ea4598d1"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.008096 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z72zk" podStartSLOduration=71.008073697 podStartE2EDuration="1m11.008073697s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:40.949985727 +0000 UTC m=+120.459704703" watchObservedRunningTime="2026-03-20 13:23:41.008073697 +0000 UTC m=+120.517792653" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.019051 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" event={"ID":"cedee9dd-d9ad-43b7-97aa-f709c9a59604","Type":"ContainerStarted","Data":"ad970c2d8a59f21eaf504b23c8bd208c5c734f84ffafa8c659227d7ebc695ad8"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 
13:23:41.028990 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.055513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" event={"ID":"357dbb1d-53bb-4dc2-9645-377146fed802","Type":"ContainerStarted","Data":"6e46a3483a069087dd7302f09a53adc4aa471424a033771fedee25bf00b9fe1e"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.055592 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" event={"ID":"357dbb1d-53bb-4dc2-9645-377146fed802","Type":"ContainerStarted","Data":"0034e705dbc3d363e01f788509438b4db2b758d40d2c020c639701f45aa740a1"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.056210 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.060983 4895 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zts4g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.061095 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" podUID="357dbb1d-53bb-4dc2-9645-377146fed802" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.083510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.084760 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.584742409 +0000 UTC m=+121.094461375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.099698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" event={"ID":"4005b37f-581a-4651-9dcb-f16414503616","Type":"ContainerStarted","Data":"ba30f8c88b06385034747306bc16a99b12130e896cb4b295f3aca4aa28b5fb9f"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.218605 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" podStartSLOduration=71.218582651 podStartE2EDuration="1m11.218582651s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.090478394 +0000 UTC m=+120.600197360" 
watchObservedRunningTime="2026-03-20 13:23:41.218582651 +0000 UTC m=+120.728301617" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.223677 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" event={"ID":"b79ea384-24f2-4134-9f53-3f6764f1a0d2","Type":"ContainerStarted","Data":"edbb42bb94790710807be56eb3b9d9e686b2c2cb4bec68c52b545dff1e8f1525"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.223721 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" event={"ID":"b79ea384-24f2-4134-9f53-3f6764f1a0d2","Type":"ContainerStarted","Data":"500f4ced0ffe68c05830f0b5b5994731dd8edbeef2e86c86e1f9203e34a4f0f1"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.229197 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.239122 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.739091876 +0000 UTC m=+121.248810842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.246776 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.252606 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.74899291 +0000 UTC m=+121.258711876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.264642 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdh8v" podStartSLOduration=71.264618709 podStartE2EDuration="1m11.264618709s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.228805673 +0000 UTC m=+120.738524639" watchObservedRunningTime="2026-03-20 13:23:41.264618709 +0000 UTC m=+120.774337675" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.344217 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" podStartSLOduration=71.344201745 podStartE2EDuration="1m11.344201745s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.267242566 +0000 UTC m=+120.776961542" watchObservedRunningTime="2026-03-20 13:23:41.344201745 +0000 UTC m=+120.853920711" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.353726 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" 
containerID="cri-o://78aff67b89b646349bb8d0f231a6a873784529221d5a59a18d1d1f0991fc987d" gracePeriod=30 Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.354097 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.356690 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.856663005 +0000 UTC m=+121.366381991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.376299 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" podStartSLOduration=71.37627431 podStartE2EDuration="1m11.37627431s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.353127768 +0000 UTC m=+120.862846734" watchObservedRunningTime="2026-03-20 13:23:41.37627431 +0000 UTC m=+120.885993276" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 
13:23:41.455950 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.457017 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:41.957006181 +0000 UTC m=+121.466725147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.464599 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7fjj4" podStartSLOduration=71.464580925 podStartE2EDuration="1m11.464580925s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.45514708 +0000 UTC m=+120.964866036" watchObservedRunningTime="2026-03-20 13:23:41.464580925 +0000 UTC m=+120.974299891" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.520680 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" podStartSLOduration=71.52066021 podStartE2EDuration="1m11.52066021s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.520302083 +0000 UTC m=+121.030021049" watchObservedRunningTime="2026-03-20 13:23:41.52066021 +0000 UTC m=+121.030379176" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.557409 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zc2zz" podStartSLOduration=71.557374946 podStartE2EDuration="1m11.557374946s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.552729616 +0000 UTC m=+121.062448582" watchObservedRunningTime="2026-03-20 13:23:41.557374946 +0000 UTC m=+121.067093912" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.559358 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.560080 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.060049164 +0000 UTC m=+121.569768130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.561566 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.561958 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.561988 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562060 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wchf2" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562082 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" event={"ID":"259c146d-1ce3-4a6d-bf50-36315be6efae","Type":"ContainerStarted","Data":"6839998608aa933cd7b111a304191a2938a5266f048f537f2f246035e21e2584"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562099 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" event={"ID":"259c146d-1ce3-4a6d-bf50-36315be6efae","Type":"ContainerStarted","Data":"072e3cca491f359c1783ba4a5a295ce95b0b240945fc809acc60b72748ba2ca9"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fpmdm" event={"ID":"27950301-9838-48a8-a0d9-83a4083d2e0d","Type":"ContainerStarted","Data":"77a337cccff02c1f02059ace1aa27dcae338c67b8cb018d89e88ed6e85431bb1"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fpmdm" event={"ID":"27950301-9838-48a8-a0d9-83a4083d2e0d","Type":"ContainerStarted","Data":"f1591a9d5c8d1c4eb52d45cffa2fcfa0d55fd6a818f5b5a48da9e903afdced56"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562137 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" event={"ID":"1c06369a-5668-4982-aaba-acc71c4c4ce1","Type":"ContainerStarted","Data":"225449bf0c2daf4f9c9c382b10e91c6b4c3e0e867134e446204c4fb10dc2643d"} Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.562151 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" event={"ID":"0e827db1-1343-4055-9fb1-739b183fbf0a","Type":"ContainerStarted","Data":"7bba3f3a27d1413a1fc639c7eb58f38b179056a7da88c671a3e83034a50b1eb2"} Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.575050 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.075031089 +0000 UTC m=+121.584750055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.649096 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60242: no serving certificate available for the kubelet" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.658540 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:41 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:41 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:41 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.658601 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.660339 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qml88" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.663524 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.664351 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.164329556 +0000 UTC m=+121.674048522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.715444 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" podStartSLOduration=71.715414873 podStartE2EDuration="1m11.715414873s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.649885392 +0000 UTC m=+121.159604358" watchObservedRunningTime="2026-03-20 13:23:41.715414873 +0000 UTC m=+121.225133839" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.754261 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-px7gz" podStartSLOduration=71.754234905 podStartE2EDuration="1m11.754234905s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
13:23:41.747711953 +0000 UTC m=+121.257430919" watchObservedRunningTime="2026-03-20 13:23:41.754234905 +0000 UTC m=+121.263953871" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.765094 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.765421 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.265408947 +0000 UTC m=+121.775127913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.840787 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" podStartSLOduration=71.840770621 podStartE2EDuration="1m11.840770621s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.840166668 +0000 UTC m=+121.349885634" watchObservedRunningTime="2026-03-20 13:23:41.840770621 +0000 UTC 
m=+121.350489587" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.859700 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.866893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.867191 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.367176513 +0000 UTC m=+121.876895479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.875479 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fpmdm" podStartSLOduration=8.875461133 podStartE2EDuration="8.875461133s" podCreationTimestamp="2026-03-20 13:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.873432179 +0000 UTC m=+121.383151155" watchObservedRunningTime="2026-03-20 13:23:41.875461133 +0000 UTC m=+121.385180099" Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.899696 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"] Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.903084 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xc2mg"] Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.936858 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.936843094 podStartE2EDuration="936.843094ms" podCreationTimestamp="2026-03-20 13:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.934972883 +0000 UTC m=+121.444691849" watchObservedRunningTime="2026-03-20 13:23:41.936843094 +0000 UTC 
m=+121.446562050" Mar 20 13:23:41 crc kubenswrapper[4895]: W0320 13:23:41.945208 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa5fe02_05bc_4cd6_bf59_6d595a5ca5ff.slice/crio-6e7723e3189b95b393782039e8aa3c1936d68b7f761f1f6a937afd820c78286d WatchSource:0}: Error finding container 6e7723e3189b95b393782039e8aa3c1936d68b7f761f1f6a937afd820c78286d: Status 404 returned error can't find the container with id 6e7723e3189b95b393782039e8aa3c1936d68b7f761f1f6a937afd820c78286d Mar 20 13:23:41 crc kubenswrapper[4895]: I0320 13:23:41.968027 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:41 crc kubenswrapper[4895]: E0320 13:23:41.968356 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.468344287 +0000 UTC m=+121.978063253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.027430 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-klgkl" podStartSLOduration=72.027379747 podStartE2EDuration="1m12.027379747s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:41.965911394 +0000 UTC m=+121.475630360" watchObservedRunningTime="2026-03-20 13:23:42.027379747 +0000 UTC m=+121.537098723" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.041127 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-gm5sr"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.081436 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.081738 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:42.581721925 +0000 UTC m=+122.091440891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.131539 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5kx2j" podStartSLOduration=72.131524605 podStartE2EDuration="1m12.131524605s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:42.12900167 +0000 UTC m=+121.638720636" watchObservedRunningTime="2026-03-20 13:23:42.131524605 +0000 UTC m=+121.641243571" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.183689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.184126 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.684107265 +0000 UTC m=+122.193826231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.289782 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.293002 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.792975046 +0000 UTC m=+122.302694012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.393928 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.394212 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.89419957 +0000 UTC m=+122.403918536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.425259 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t9xh5" event={"ID":"e899877b-fe80-4ace-9b35-41eb7302cf12","Type":"ContainerStarted","Data":"afff94d71c02f0093ee3d50f868efc921cdb4bf8036043eb5f759f617d66870d"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.426791 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.429698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4fe1ccde466a340c37fede3a99be89489c1e3d0e0a33c74d4c391870e9ee87ca"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.429786 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.429799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bc67d7403b6504455b98a442afd74bf4c59eb53a85a230f787e5f0f5110d5cdf"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.429849 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fcf1dc9633c6ff8afbd1aecc86e73b75bbc24ae89b44c27cb13d665210f3a011"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.440932 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.447940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" event={"ID":"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff","Type":"ContainerStarted","Data":"6e7723e3189b95b393782039e8aa3c1936d68b7f761f1f6a937afd820c78286d"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.458244 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.466024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-frk59" event={"ID":"b79ea384-24f2-4134-9f53-3f6764f1a0d2","Type":"ContainerStarted","Data":"08a0e401f671b0ded91ac4b7244e67b75d9e6e0e76addf7dafce98e07a36c856"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.477036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7dpsb" 
event={"ID":"54f79e52-3f56-4f2b-a0b1-5a40f029633b","Type":"ContainerStarted","Data":"804b04f07b2ed9bb0ebc3c5c08e40937552ad9df5c28e05e66f5463deed50289"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.478257 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.487667 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" event={"ID":"617be708-08ca-4534-ae2d-2ae747070e51","Type":"ContainerStarted","Data":"765a867cc48822e4b8997c80f847aae4eb109f470cc6b00fbe8517f955bb8b94"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.487727 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" event={"ID":"617be708-08ca-4534-ae2d-2ae747070e51","Type":"ContainerStarted","Data":"89413bfa9a966ba46078f15e49fbb2941c8383d23a9f0fb18e49d09d6f73ea02"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.494833 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.495062 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.495144 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.495184 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cw9j\" (UniqueName: \"kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.495346 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:42.995326712 +0000 UTC m=+122.505045678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.496703 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" event={"ID":"3585ee2c-5c20-4a41-8909-1363e4554ccf","Type":"ContainerStarted","Data":"0d403f2741546d1e3859a40f1c95426abf21241c6e34ec45f4373f0a031f79a3"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.496747 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" event={"ID":"3585ee2c-5c20-4a41-8909-1363e4554ccf","Type":"ContainerStarted","Data":"05b70df3e6b951f620f04bfa127318e49f69e7b7a8aaedb04e3943727039cac2"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.536422 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7dpsb" podStartSLOduration=9.536372852 podStartE2EDuration="9.536372852s" podCreationTimestamp="2026-03-20 13:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:42.536164308 +0000 UTC m=+122.045883274" watchObservedRunningTime="2026-03-20 13:23:42.536372852 +0000 UTC m=+122.046091818" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.541014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnv8x" 
event={"ID":"210eb590-1075-4279-856f-0899b35e0021","Type":"ContainerStarted","Data":"628ddca426f1a4c2e6a0ac3df3a2110a71d55c11b7dc522da5dbaa476297178e"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.554554 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"614fedfbcc631da86f9ff1f32db082e6a7b60421f48f8e46c6c6c024fad094d7"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.554921 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.572698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" event={"ID":"a99cb22b-8c01-4a64-b512-8e4f61fb0558","Type":"ContainerStarted","Data":"5caf6190d2359b535f8a4cfc632be6b423b883e464ebb61694cc7ed11d731c06"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.583104 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.590698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9wvm6" event={"ID":"6ffc6194-dadb-4607-a3c0-08dbdff1d476","Type":"ContainerStarted","Data":"7aadd9fe0660163a5c1b8dfd8e380bf7ce762f78a0dbfd9b21afbe960597fe68"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.597726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.597818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.597867 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.597919 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cw9j\" (UniqueName: \"kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j\") pod \"certified-operators-tdcr2\" (UID: 
\"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.600835 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.100821509 +0000 UTC m=+122.610540475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.601619 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.601766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.617364 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" 
event={"ID":"0a0750f5-6779-49ca-ac7e-2b24526fbd5d","Type":"ContainerStarted","Data":"09a4fc188c80c796e3adf31913997dd9450d43aff05a416852b4064a23de8fcd"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.617421 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" event={"ID":"0a0750f5-6779-49ca-ac7e-2b24526fbd5d","Type":"ContainerStarted","Data":"c383530200340981e13eaf62c1b0b5dddd807aad7034a7ee76b214b786d7e71f"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.621431 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpm8f"] Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.621933 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.621952 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.622028 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerName="controller-manager" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.622680 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.631627 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.632201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" event={"ID":"22aa23b6-96e1-49b3-bbb9-d414e27df43b","Type":"ContainerStarted","Data":"20b7ef932c51c73b8d3b70964bb11f9f3ef3805b22911da9757757e26bb8b9f0"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.639825 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:42 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:42 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:42 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.639874 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.642762 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpm8f"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.651708 4895 generic.go:334] "Generic (PLEG): container finished" podID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" containerID="78aff67b89b646349bb8d0f231a6a873784529221d5a59a18d1d1f0991fc987d" exitCode=0 Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.651812 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" event={"ID":"accdc13f-9bde-41cd-8caf-e1aafdf7d913","Type":"ContainerDied","Data":"78aff67b89b646349bb8d0f231a6a873784529221d5a59a18d1d1f0991fc987d"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.651843 4895 scope.go:117] "RemoveContainer" containerID="78aff67b89b646349bb8d0f231a6a873784529221d5a59a18d1d1f0991fc987d" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.651976 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2fv7p" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.652863 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cw9j\" (UniqueName: \"kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j\") pod \"certified-operators-tdcr2\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") " pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.656096 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.656664 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.671355 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2mx5m" podStartSLOduration=72.671340809 podStartE2EDuration="1m12.671340809s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:42.669035559 +0000 UTC m=+122.178754525" watchObservedRunningTime="2026-03-20 13:23:42.671340809 +0000 UTC m=+122.181059775" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.694617 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2dq2z" event={"ID":"bc0a8d83-a2d4-4231-a024-85e6cf31955c","Type":"ContainerStarted","Data":"1ec014ae9996965ce9c121a2a173d613048799bd8c0c355e5b1d98141bc5e341"} Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.696855 4895 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hlxvz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.696916 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.710378 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca\") pod \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.710453 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles\") pod \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.710476 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config\") pod \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.711979 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "accdc13f-9bde-41cd-8caf-e1aafdf7d913" (UID: "accdc13f-9bde-41cd-8caf-e1aafdf7d913"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.713336 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.713572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca" (OuterVolumeSpecName: "client-ca") pod "accdc13f-9bde-41cd-8caf-e1aafdf7d913" (UID: "accdc13f-9bde-41cd-8caf-e1aafdf7d913"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.714237 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config" (OuterVolumeSpecName: "config") pod "accdc13f-9bde-41cd-8caf-e1aafdf7d913" (UID: "accdc13f-9bde-41cd-8caf-e1aafdf7d913"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.726469 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lj4nt" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.727567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.727617 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert\") pod \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.727677 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8b8q\" (UniqueName: \"kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q\") pod \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\" (UID: \"accdc13f-9bde-41cd-8caf-e1aafdf7d913\") " Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.728030 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.228009647 +0000 UTC m=+122.737728613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.728384 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733428 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghwhb\" (UniqueName: \"kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcd5w\" (UniqueName: \"kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " 
pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733526 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733744 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733766 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733801 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733972 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.733987 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.734001 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accdc13f-9bde-41cd-8caf-e1aafdf7d913-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.735868 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.235845848 +0000 UTC m=+122.745564804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.739772 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "accdc13f-9bde-41cd-8caf-e1aafdf7d913" (UID: "accdc13f-9bde-41cd-8caf-e1aafdf7d913"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.740987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q" (OuterVolumeSpecName: "kube-api-access-b8b8q") pod "accdc13f-9bde-41cd-8caf-e1aafdf7d913" (UID: "accdc13f-9bde-41cd-8caf-e1aafdf7d913"). InnerVolumeSpecName "kube-api-access-b8b8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.768027 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.786197 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c6qbf" podStartSLOduration=72.783828818 podStartE2EDuration="1m12.783828818s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:42.733351543 +0000 UTC m=+122.243070519" watchObservedRunningTime="2026-03-20 13:23:42.783828818 +0000 UTC m=+122.293547784" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.798522 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" podStartSLOduration=73.798506516 podStartE2EDuration="1m13.798506516s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:42.795793968 +0000 UTC m=+122.305512934" watchObservedRunningTime="2026-03-20 13:23:42.798506516 +0000 UTC m=+122.308225502" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.825654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kd8jz"] Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.826523 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.838081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.838554 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.338528434 +0000 UTC m=+122.848247400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghwhb\" (UniqueName: \"kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcd5w\" (UniqueName: 
\"kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840572 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840824 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840852 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.840942 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.841331 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.842762 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.860589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.864177 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.872554 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.873253 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.373239736 +0000 UTC m=+122.882958702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.873713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.874235 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accdc13f-9bde-41cd-8caf-e1aafdf7d913-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.874259 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8b8q\" (UniqueName: \"kubernetes.io/projected/accdc13f-9bde-41cd-8caf-e1aafdf7d913-kube-api-access-b8b8q\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.874844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.905495 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd8jz"] 
Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.911479 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcd5w\" (UniqueName: \"kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w\") pod \"controller-manager-8676c49c64-p42fk\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.913442 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghwhb\" (UniqueName: \"kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb\") pod \"community-operators-qpm8f\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") " pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.968841 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.977424 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.977643 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.977696 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:42 crc kubenswrapper[4895]: I0320 13:23:42.977726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p84sx\" (UniqueName: \"kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:42 crc kubenswrapper[4895]: E0320 13:23:42.977816 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.477801264 +0000 UTC m=+122.987520230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.033772 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2qwtb"] Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.033910 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.045311 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.062871 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qwtb"] Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.062371 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hjc74" podStartSLOduration=73.062357387 podStartE2EDuration="1m13.062357387s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:43.05881308 +0000 UTC m=+122.568532046" watchObservedRunningTime="2026-03-20 13:23:43.062357387 +0000 UTC m=+122.572076343" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.079751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.080196 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.080298 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.080426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p84sx\" (UniqueName: \"kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.081230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.081637 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.581615004 +0000 UTC m=+123.091333970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.082106 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.114051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zts4g" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.134071 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p84sx\" (UniqueName: \"kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx\") pod \"certified-operators-kd8jz\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.181069 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.181287 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ct47f\" (UniqueName: \"kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.181338 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.181431 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.181543 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.68152731 +0000 UTC m=+123.191246266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.185310 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"] Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.201141 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2fv7p"] Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.211046 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.238835 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d87434-561d-4397-94b6-1a96d6286361" path="/var/lib/kubelet/pods/33d87434-561d-4397-94b6-1a96d6286361/volumes" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.239620 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="accdc13f-9bde-41cd-8caf-e1aafdf7d913" path="/var/lib/kubelet/pods/accdc13f-9bde-41cd-8caf-e1aafdf7d913/volumes" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.285441 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:43 crc 
kubenswrapper[4895]: I0320 13:23:43.285484 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.285514 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct47f\" (UniqueName: \"kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.285546 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.285905 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.286107 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.786097408 +0000 UTC m=+123.295816374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.286441 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.335406 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct47f\" (UniqueName: \"kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f\") pod \"community-operators-2qwtb\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.386693 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.386958 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:23:43.886943674 +0000 UTC m=+123.396662640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.403689 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.491668 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.492113 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:43.992079264 +0000 UTC m=+123.501798230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.593309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.593612 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.093598015 +0000 UTC m=+123.603316981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.651016 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:43 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:43 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:43 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.651075 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.686865 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"] Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.689273 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4af527d6600081e4f440d1468fb027adc6e3f33bf05dc9a3838f34224c397b59"} Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.698951 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.705296 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.205272766 +0000 UTC m=+123.714991722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.705877 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t9xh5" event={"ID":"e899877b-fe80-4ace-9b35-41eb7302cf12","Type":"ContainerStarted","Data":"9ff89daf1777c9644727702632f5810e25c4c98269355bc2f7d13fae4a41d58f"}
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.730961 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8bee6b0689fe9ae2c2ed8a5edc9c316216f7fcb417ee9899f5c3aba8ab7689b2"}
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.742330 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-t9xh5" podStartSLOduration=74.74231393 podStartE2EDuration="1m14.74231393s" podCreationTimestamp="2026-03-20 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:43.742094135 +0000 UTC m=+123.251813101" watchObservedRunningTime="2026-03-20 13:23:43.74231393 +0000 UTC m=+123.252032896"
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.752708 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" event={"ID":"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff","Type":"ContainerStarted","Data":"b2bb5df726ff567d2d891091fbd958775333b8f2fa054d7c482d19de6d1200da"}
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.753801 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.774866 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.801967 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" podStartSLOduration=3.801941122 podStartE2EDuration="3.801941122s" podCreationTimestamp="2026-03-20 13:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:43.794892459 +0000 UTC m=+123.304611425" watchObservedRunningTime="2026-03-20 13:23:43.801941122 +0000 UTC m=+123.311660088"
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.808015 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.808634 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.308606986 +0000 UTC m=+123.818325952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.819704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" event={"ID":"a99cb22b-8c01-4a64-b512-8e4f61fb0558","Type":"ContainerStarted","Data":"670a94fc13207eb63b7ad7233f1f4c4ff89427571d4f85ee992e05c0218cecca"}
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.825237 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" gracePeriod=30
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.842452 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"]
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.849252 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd8jz"]
Mar 20 13:23:43 crc kubenswrapper[4895]: W0320 13:23:43.877278 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931af0a9_a596_4b43_b788_107bb6d266ce.slice/crio-cd991533fb4b9d0dcfb99663a400fccb72ebab2b8003aa9fe8c1ea6668a329d7 WatchSource:0}: Error finding container cd991533fb4b9d0dcfb99663a400fccb72ebab2b8003aa9fe8c1ea6668a329d7: Status 404 returned error can't find the container with id cd991533fb4b9d0dcfb99663a400fccb72ebab2b8003aa9fe8c1ea6668a329d7
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.911873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:43 crc kubenswrapper[4895]: E0320 13:23:43.916532 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.416516966 +0000 UTC m=+123.926235932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.930365 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpm8f"]
Mar 20 13:23:43 crc kubenswrapper[4895]: I0320 13:23:43.992125 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qwtb"]
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.000125 4895 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 13:23:44 crc kubenswrapper[4895]: W0320 13:23:44.018449 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71969a9_97c9_46c4_9e1c_051f3c86ae91.slice/crio-3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7 WatchSource:0}: Error finding container 3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7: Status 404 returned error can't find the container with id 3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.018826 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.019115 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.51909882 +0000 UTC m=+124.028817786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.122199 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.122618 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.622600694 +0000 UTC m=+124.132319660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.223875 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.224221 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.724206997 +0000 UTC m=+124.233925963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.255914 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60248: no serving certificate available for the kubelet"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.325903 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.326305 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.826286281 +0000 UTC m=+124.336005247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.426808 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.427208 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:44.927193269 +0000 UTC m=+124.436912235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.528164 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.528477 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:23:45.028465534 +0000 UTC m=+124.538184500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2tqkj" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.599807 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.601022 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.603103 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.609914 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.628867 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:44 crc kubenswrapper[4895]: E0320 13:23:44.629190 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:45.129175208 +0000 UTC m=+124.638894174 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.638281 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:23:44 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld
Mar 20 13:23:44 crc kubenswrapper[4895]: [+]process-running ok
Mar 20 13:23:44 crc kubenswrapper[4895]: healthz check failed
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.638319 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.663254 4895 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:23:44.000511167Z","Handler":null,"Name":""}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.667850 4895 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.667885 4895 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.730624 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.730674 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.730816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvrcl\" (UniqueName: \"kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.730857 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.734358 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.734425 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.761064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2tqkj\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") " pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.829970 4895 generic.go:334] "Generic (PLEG): container finished" podID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerID="a341884ee1ddc62e5105651f348685cfbb7e576c10ae68ecc99cb3db814cad4b" exitCode=0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.830248 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerDied","Data":"a341884ee1ddc62e5105651f348685cfbb7e576c10ae68ecc99cb3db814cad4b"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.830294 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerStarted","Data":"dbb1738481161f0882fef38e1bad6c14a79d9503baf2d0f304342f6e75379640"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.831845 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832230 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832449 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832509 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvrcl\" (UniqueName: \"kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832529 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832844 4895 generic.go:334] "Generic (PLEG): container finished" podID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerID="feba80ca1f4ff09ebaff062bc0c43d8104d95c72ba532a89192c2e95cef8f601" exitCode=0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832907 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerDied","Data":"feba80ca1f4ff09ebaff062bc0c43d8104d95c72ba532a89192c2e95cef8f601"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.832964 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerStarted","Data":"3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.833102 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.835425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" event={"ID":"931af0a9-a596-4b43-b788-107bb6d266ce","Type":"ContainerStarted","Data":"2b39e932f3dd0fab27b13cb44927f68962935e581261b327e1a6e8d81d5d3daa"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.835455 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" event={"ID":"931af0a9-a596-4b43-b788-107bb6d266ce","Type":"ContainerStarted","Data":"cd991533fb4b9d0dcfb99663a400fccb72ebab2b8003aa9fe8c1ea6668a329d7"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.835737 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.840871 4895 generic.go:334] "Generic (PLEG): container finished" podID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerID="4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a" exitCode=0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.840964 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerDied","Data":"4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.840996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerStarted","Data":"b370d16710485612953694882ed525e618ca6e1a017b56efbed37add63e7706f"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.843810 4895 generic.go:334] "Generic (PLEG): container finished" podID="444aebdc-d867-44b7-9884-e0d89fea57d8" containerID="c51ce63f69819dd27c1a17f9fc4873f38376b68cef2652af8006221a827ab441" exitCode=0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.843867 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" event={"ID":"444aebdc-d867-44b7-9884-e0d89fea57d8","Type":"ContainerDied","Data":"c51ce63f69819dd27c1a17f9fc4873f38376b68cef2652af8006221a827ab441"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.844518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.846321 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.848735 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" event={"ID":"a99cb22b-8c01-4a64-b512-8e4f61fb0558","Type":"ContainerStarted","Data":"97215105221d9f523c58bf4f43d20811e79b48393b033bd5e4aca9244bcf65d1"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.848770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" event={"ID":"a99cb22b-8c01-4a64-b512-8e4f61fb0558","Type":"ContainerStarted","Data":"849e6efbf3b0cad0cf2b9cdbdcf1628a93cc4110087897756a16613b91040731"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.850409 4895 generic.go:334] "Generic (PLEG): container finished" podID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerID="e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f" exitCode=0
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.850552 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerDied","Data":"e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.850575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerStarted","Data":"fa42e903ce40667fbdc0d5481bb3c1b35d83111daccd5873f825cee9c59a5b9d"}
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.867616 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvrcl\" (UniqueName: \"kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl\") pod \"redhat-marketplace-gn8m7\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") " pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.897100 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" podStartSLOduration=4.8970832170000005 podStartE2EDuration="4.897083217s" podCreationTimestamp="2026-03-20 13:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:44.89631975 +0000 UTC m=+124.406038726" watchObservedRunningTime="2026-03-20 13:23:44.897083217 +0000 UTC m=+124.406802183"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.915307 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.988531 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9hvwh" podStartSLOduration=11.988510079 podStartE2EDuration="11.988510079s" podCreationTimestamp="2026-03-20 13:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:44.987584918 +0000 UTC m=+124.497303894" watchObservedRunningTime="2026-03-20 13:23:44.988510079 +0000 UTC m=+124.498229045"
Mar 20 13:23:44 crc kubenswrapper[4895]: I0320 13:23:44.999735 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"]
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.013149 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.025471 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.025686 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"]
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.034623 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.137911 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.138250 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.138287 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9h6f\" (UniqueName: \"kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.181055 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:23:45 crc kubenswrapper[4895]: W0320 13:23:45.198039 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d9f9c9_84fa_40b3_95fe_dd2f821c1262.slice/crio-8b81a0a1feb8d5d2e1b34f5bf7cd7152aba9b373e841540c9de7bbb6b72efd29 WatchSource:0}: Error finding container 8b81a0a1feb8d5d2e1b34f5bf7cd7152aba9b373e841540c9de7bbb6b72efd29: Status 404 returned error can't find the container with id 8b81a0a1feb8d5d2e1b34f5bf7cd7152aba9b373e841540c9de7bbb6b72efd29
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.230595 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.244153 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.244195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.244239 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9h6f\" (UniqueName: \"kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.244923 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm"
Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.245132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.279378 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9h6f\" (UniqueName: \"kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f\") pod \"redhat-marketplace-ltcwm\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.291035 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.339885 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.525061 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.525126 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.525187 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial 
tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.525259 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.598022 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"] Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.599048 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.601210 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.612493 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"] Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.639892 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:45 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:45 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:45 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.639963 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:45 crc 
kubenswrapper[4895]: I0320 13:23:45.668247 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"] Mar 20 13:23:45 crc kubenswrapper[4895]: W0320 13:23:45.678408 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50352717_2200_417f_b1ff_7e9adbe0cbf8.slice/crio-91b08e7fd4a16d7dfde34cb2bf6d5a940d441e902bdd591920d62f102f9e6ee8 WatchSource:0}: Error finding container 91b08e7fd4a16d7dfde34cb2bf6d5a940d441e902bdd591920d62f102f9e6ee8: Status 404 returned error can't find the container with id 91b08e7fd4a16d7dfde34cb2bf6d5a940d441e902bdd591920d62f102f9e6ee8 Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.753347 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzpdt\" (UniqueName: \"kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.753445 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.753513 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.769133 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.769184 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.791946 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.807098 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.807132 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.807857 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.808770 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.832386 4895 patch_prober.go:28] interesting pod/console-f9d7485db-wrj6w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.832490 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wrj6w" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.833230 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.833448 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.842474 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.856485 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.856563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzpdt\" (UniqueName: \"kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt\") pod 
\"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.856632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.858008 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.858320 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.878691 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" event={"ID":"e2338d00-a33d-4b4d-8686-064b95e39943","Type":"ContainerStarted","Data":"f9f94b55ba776816bda0ce898bdea565e4843d7596c715a485eaba0b3491be16"} Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.878729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" event={"ID":"e2338d00-a33d-4b4d-8686-064b95e39943","Type":"ContainerStarted","Data":"cf2b2936abf23c991c018d02c09b0fdc527707b4bb7aeb58701cda06ea9c463d"} Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 
13:23:45.881139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerStarted","Data":"91b08e7fd4a16d7dfde34cb2bf6d5a940d441e902bdd591920d62f102f9e6ee8"} Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.885747 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.893456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzpdt\" (UniqueName: \"kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt\") pod \"redhat-operators-6z86w\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") " pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.894713 4895 generic.go:334] "Generic (PLEG): container finished" podID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerID="beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31" exitCode=0 Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.896172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerDied","Data":"beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31"} Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.896213 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerStarted","Data":"8b81a0a1feb8d5d2e1b34f5bf7cd7152aba9b373e841540c9de7bbb6b72efd29"} Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.920383 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-g8hxm" Mar 20 13:23:45 crc kubenswrapper[4895]: 
I0320 13:23:45.923879 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.943902 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" podStartSLOduration=75.943879192 podStartE2EDuration="1m15.943879192s" podCreationTimestamp="2026-03-20 13:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:45.930366499 +0000 UTC m=+125.440085475" watchObservedRunningTime="2026-03-20 13:23:45.943879192 +0000 UTC m=+125.453598158" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.958907 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:45 crc kubenswrapper[4895]: I0320 13:23:45.959058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.025782 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5g6zh"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.034213 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.044174 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g6zh"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.062097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.062169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.062288 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.107330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.169117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.169228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.169274 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqht\" (UniqueName: \"kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.183533 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.270869 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.270943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqht\" (UniqueName: \"kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.271020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.271717 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.272055 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " 
pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.293493 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqht\" (UniqueName: \"kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht\") pod \"redhat-operators-5g6zh\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") " pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.294079 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.368303 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.372252 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxdvl\" (UniqueName: \"kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl\") pod \"444aebdc-d867-44b7-9884-e0d89fea57d8\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.372297 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume\") pod \"444aebdc-d867-44b7-9884-e0d89fea57d8\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.372336 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume\") pod \"444aebdc-d867-44b7-9884-e0d89fea57d8\" (UID: \"444aebdc-d867-44b7-9884-e0d89fea57d8\") " Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.375540 4895 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "444aebdc-d867-44b7-9884-e0d89fea57d8" (UID: "444aebdc-d867-44b7-9884-e0d89fea57d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.378425 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "444aebdc-d867-44b7-9884-e0d89fea57d8" (UID: "444aebdc-d867-44b7-9884-e0d89fea57d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.378547 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl" (OuterVolumeSpecName: "kube-api-access-bxdvl") pod "444aebdc-d867-44b7-9884-e0d89fea57d8" (UID: "444aebdc-d867-44b7-9884-e0d89fea57d8"). InnerVolumeSpecName "kube-api-access-bxdvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.393900 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:23:46 crc kubenswrapper[4895]: W0320 13:23:46.429917 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod485a7267_c39b_4b1e_95b1_075e868421ed.slice/crio-9431a6953ef5e0b1f45db6c0779276d8502cc96cdb2ea3a5113fc9126f89c67a WatchSource:0}: Error finding container 9431a6953ef5e0b1f45db6c0779276d8502cc96cdb2ea3a5113fc9126f89c67a: Status 404 returned error can't find the container with id 9431a6953ef5e0b1f45db6c0779276d8502cc96cdb2ea3a5113fc9126f89c67a Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.474044 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444aebdc-d867-44b7-9884-e0d89fea57d8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.474083 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxdvl\" (UniqueName: \"kubernetes.io/projected/444aebdc-d867-44b7-9884-e0d89fea57d8-kube-api-access-bxdvl\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.474097 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444aebdc-d867-44b7-9884-e0d89fea57d8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.635099 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.638854 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:23:46 crc kubenswrapper[4895]: E0320 13:23:46.639064 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444aebdc-d867-44b7-9884-e0d89fea57d8" containerName="collect-profiles" Mar 20 
13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.639075 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="444aebdc-d867-44b7-9884-e0d89fea57d8" containerName="collect-profiles" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.639174 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="444aebdc-d867-44b7-9884-e0d89fea57d8" containerName="collect-profiles" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.639528 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.642533 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.643565 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.663568 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:46 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:46 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:46 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.663904 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.663825 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 
13:23:46.676713 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.676760 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.701186 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.777521 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.777571 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.777700 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir\") pod \"revision-pruner-8-crc\" 
(UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.816539 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5g6zh"] Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.818218 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:46 crc kubenswrapper[4895]: W0320 13:23:46.870349 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23478f5b_63b0_4a43_a716_9d22fad71c2c.slice/crio-a94c2c2dfad2b953c8ffd52a627ffbe838d285f944ce9e464702460fe4e6747f WatchSource:0}: Error finding container a94c2c2dfad2b953c8ffd52a627ffbe838d285f944ce9e464702460fe4e6747f: Status 404 returned error can't find the container with id a94c2c2dfad2b953c8ffd52a627ffbe838d285f944ce9e464702460fe4e6747f Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.905470 4895 generic.go:334] "Generic (PLEG): container finished" podID="485a7267-c39b-4b1e-95b1-075e868421ed" containerID="99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e" exitCode=0 Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.905536 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerDied","Data":"99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.905558 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" 
event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerStarted","Data":"9431a6953ef5e0b1f45db6c0779276d8502cc96cdb2ea3a5113fc9126f89c67a"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.914871 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" event={"ID":"444aebdc-d867-44b7-9884-e0d89fea57d8","Type":"ContainerDied","Data":"6b0cb6b651111fa9c6fee7f40a295c34ff9686e1faa9e022a44884069e888a3d"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.914924 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0cb6b651111fa9c6fee7f40a295c34ff9686e1faa9e022a44884069e888a3d" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.915044 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h" Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.921137 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerStarted","Data":"a94c2c2dfad2b953c8ffd52a627ffbe838d285f944ce9e464702460fe4e6747f"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.922325 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3611a3fa-a507-4a5b-bfce-655ccac688c0","Type":"ContainerStarted","Data":"f26c4edf879c2f35c684703154fe3eb0beb18b549cd54b6efaf769d7c61613c7"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.931297 4895 generic.go:334] "Generic (PLEG): container finished" podID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerID="858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67" exitCode=0 Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.933531 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerDied","Data":"858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67"} Mar 20 13:23:46 crc kubenswrapper[4895]: I0320 13:23:46.982479 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.003352 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:23:47 crc kubenswrapper[4895]: E0320 13:23:47.141453 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:47 crc kubenswrapper[4895]: E0320 13:23:47.142761 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:47 crc kubenswrapper[4895]: E0320 13:23:47.146211 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:47 crc kubenswrapper[4895]: E0320 13:23:47.146297 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.375721 4895 ???:1] "http: TLS handshake error from 192.168.126.11:60262: no serving certificate available for the kubelet" Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.510945 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.641626 4895 patch_prober.go:28] interesting pod/router-default-5444994796-fgv7q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:23:47 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Mar 20 13:23:47 crc kubenswrapper[4895]: [+]process-running ok Mar 20 13:23:47 crc kubenswrapper[4895]: healthz check failed Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.641679 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fgv7q" podUID="2fe0311f-3e9d-4749-b06c-a28d7d889c45" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.810760 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.952707 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3611a3fa-a507-4a5b-bfce-655ccac688c0","Type":"ContainerStarted","Data":"4afe15461af7de00763ff6416cf08d3682ccc9fadfe91f7908ed432ee7e74b05"} Mar 20 13:23:47 crc kubenswrapper[4895]: I0320 13:23:47.969815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ade8a205-5302-4452-9c51-f3a9053f0c03","Type":"ContainerStarted","Data":"c84072a647ce9ea6fc3707c368242dc10550a8968ee0dec59e31e772745a489e"} Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.651522 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.661103 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fgv7q" Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.992969 4895 generic.go:334] "Generic (PLEG): container finished" podID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerID="02d37e2a658a62e476ec57f75a75370f33365181af42940b24d1cf8d5947e1ff" exitCode=0 Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.993035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerDied","Data":"02d37e2a658a62e476ec57f75a75370f33365181af42940b24d1cf8d5947e1ff"} Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.998379 4895 generic.go:334] "Generic (PLEG): container finished" podID="3611a3fa-a507-4a5b-bfce-655ccac688c0" containerID="4afe15461af7de00763ff6416cf08d3682ccc9fadfe91f7908ed432ee7e74b05" exitCode=0 Mar 20 13:23:48 crc kubenswrapper[4895]: I0320 13:23:48.998499 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3611a3fa-a507-4a5b-bfce-655ccac688c0","Type":"ContainerDied","Data":"4afe15461af7de00763ff6416cf08d3682ccc9fadfe91f7908ed432ee7e74b05"} Mar 20 13:23:49 crc kubenswrapper[4895]: I0320 13:23:49.001297 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"ade8a205-5302-4452-9c51-f3a9053f0c03","Type":"ContainerStarted","Data":"fae31f3133695e43910b3b5c8b15ecc9cbfe30ae8c69a52f44ee5fd006fa6bc4"} Mar 20 13:23:49 crc kubenswrapper[4895]: I0320 13:23:49.028578 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.028561952 podStartE2EDuration="3.028561952s" podCreationTimestamp="2026-03-20 13:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:49.025569047 +0000 UTC m=+128.535288013" watchObservedRunningTime="2026-03-20 13:23:49.028561952 +0000 UTC m=+128.538280918" Mar 20 13:23:49 crc kubenswrapper[4895]: I0320 13:23:49.397767 4895 ???:1] "http: TLS handshake error from 192.168.126.11:54906: no serving certificate available for the kubelet" Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.010342 4895 generic.go:334] "Generic (PLEG): container finished" podID="ade8a205-5302-4452-9c51-f3a9053f0c03" containerID="fae31f3133695e43910b3b5c8b15ecc9cbfe30ae8c69a52f44ee5fd006fa6bc4" exitCode=0 Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.011191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ade8a205-5302-4452-9c51-f3a9053f0c03","Type":"ContainerDied","Data":"fae31f3133695e43910b3b5c8b15ecc9cbfe30ae8c69a52f44ee5fd006fa6bc4"} Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.380815 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.539579 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir\") pod \"3611a3fa-a507-4a5b-bfce-655ccac688c0\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.539746 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3611a3fa-a507-4a5b-bfce-655ccac688c0" (UID: "3611a3fa-a507-4a5b-bfce-655ccac688c0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.539852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access\") pod \"3611a3fa-a507-4a5b-bfce-655ccac688c0\" (UID: \"3611a3fa-a507-4a5b-bfce-655ccac688c0\") " Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.540176 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3611a3fa-a507-4a5b-bfce-655ccac688c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.549566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3611a3fa-a507-4a5b-bfce-655ccac688c0" (UID: "3611a3fa-a507-4a5b-bfce-655ccac688c0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:23:50 crc kubenswrapper[4895]: I0320 13:23:50.642515 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3611a3fa-a507-4a5b-bfce-655ccac688c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:23:51 crc kubenswrapper[4895]: I0320 13:23:51.086689 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:23:51 crc kubenswrapper[4895]: I0320 13:23:51.088509 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3611a3fa-a507-4a5b-bfce-655ccac688c0","Type":"ContainerDied","Data":"f26c4edf879c2f35c684703154fe3eb0beb18b549cd54b6efaf769d7c61613c7"} Mar 20 13:23:51 crc kubenswrapper[4895]: I0320 13:23:51.088550 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f26c4edf879c2f35c684703154fe3eb0beb18b549cd54b6efaf769d7c61613c7" Mar 20 13:23:51 crc kubenswrapper[4895]: I0320 13:23:51.855996 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7dpsb" Mar 20 13:23:52 crc kubenswrapper[4895]: I0320 13:23:52.359700 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.524936 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.524969 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcm6r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.525012 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.525028 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mcm6r" podUID="ed200aaa-4ed3-4e46-a934-9e97a94e0738" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.812117 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:55 crc kubenswrapper[4895]: I0320 13:23:55.817083 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:23:57 crc kubenswrapper[4895]: E0320 13:23:57.141612 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:57 crc kubenswrapper[4895]: E0320 13:23:57.143480 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:57 crc 
kubenswrapper[4895]: E0320 13:23:57.145313 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:23:57 crc kubenswrapper[4895]: E0320 13:23:57.145501 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" Mar 20 13:23:58 crc kubenswrapper[4895]: I0320 13:23:58.833647 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"] Mar 20 13:23:58 crc kubenswrapper[4895]: I0320 13:23:58.833893 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" containerID="cri-o://2b39e932f3dd0fab27b13cb44927f68962935e581261b327e1a6e8d81d5d3daa" gracePeriod=30 Mar 20 13:23:58 crc kubenswrapper[4895]: I0320 13:23:58.850584 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:23:58 crc kubenswrapper[4895]: I0320 13:23:58.850793 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" containerID="cri-o://b2bb5df726ff567d2d891091fbd958775333b8f2fa054d7c482d19de6d1200da" gracePeriod=30 Mar 20 13:23:59 crc 
kubenswrapper[4895]: I0320 13:23:59.144361 4895 generic.go:334] "Generic (PLEG): container finished" podID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerID="b2bb5df726ff567d2d891091fbd958775333b8f2fa054d7c482d19de6d1200da" exitCode=0 Mar 20 13:23:59 crc kubenswrapper[4895]: I0320 13:23:59.144436 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" event={"ID":"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff","Type":"ContainerDied","Data":"b2bb5df726ff567d2d891091fbd958775333b8f2fa054d7c482d19de6d1200da"} Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.128825 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566884-dxd7s"] Mar 20 13:24:00 crc kubenswrapper[4895]: E0320 13:24:00.129088 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3611a3fa-a507-4a5b-bfce-655ccac688c0" containerName="pruner" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.129103 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3611a3fa-a507-4a5b-bfce-655ccac688c0" containerName="pruner" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.129235 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3611a3fa-a507-4a5b-bfce-655ccac688c0" containerName="pruner" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.129661 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.132243 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.132783 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.142525 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-dxd7s"] Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.144335 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.154587 4895 generic.go:334] "Generic (PLEG): container finished" podID="931af0a9-a596-4b43-b788-107bb6d266ce" containerID="2b39e932f3dd0fab27b13cb44927f68962935e581261b327e1a6e8d81d5d3daa" exitCode=0 Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.154629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" event={"ID":"931af0a9-a596-4b43-b788-107bb6d266ce","Type":"ContainerDied","Data":"2b39e932f3dd0fab27b13cb44927f68962935e581261b327e1a6e8d81d5d3daa"} Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.292484 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxt9\" (UniqueName: \"kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9\") pod \"auto-csr-approver-29566884-dxd7s\" (UID: \"961b2d9b-3350-4f85-98af-412ea452ae83\") " pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.393429 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxt9\" (UniqueName: 
\"kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9\") pod \"auto-csr-approver-29566884-dxd7s\" (UID: \"961b2d9b-3350-4f85-98af-412ea452ae83\") " pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.416481 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxt9\" (UniqueName: \"kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9\") pod \"auto-csr-approver-29566884-dxd7s\" (UID: \"961b2d9b-3350-4f85-98af-412ea452ae83\") " pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.444345 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:00 crc kubenswrapper[4895]: I0320 13:24:00.916881 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.003220 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access\") pod \"ade8a205-5302-4452-9c51-f3a9053f0c03\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.003366 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir\") pod \"ade8a205-5302-4452-9c51-f3a9053f0c03\" (UID: \"ade8a205-5302-4452-9c51-f3a9053f0c03\") " Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.003663 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"ade8a205-5302-4452-9c51-f3a9053f0c03" (UID: "ade8a205-5302-4452-9c51-f3a9053f0c03"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.008530 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ade8a205-5302-4452-9c51-f3a9053f0c03" (UID: "ade8a205-5302-4452-9c51-f3a9053f0c03"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.030119 4895 patch_prober.go:28] interesting pod/route-controller-manager-5b59ddbb88-29t8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.030175 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.104907 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ade8a205-5302-4452-9c51-f3a9053f0c03-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.104935 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ade8a205-5302-4452-9c51-f3a9053f0c03-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 
13:24:01.160739 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ade8a205-5302-4452-9c51-f3a9053f0c03","Type":"ContainerDied","Data":"c84072a647ce9ea6fc3707c368242dc10550a8968ee0dec59e31e772745a489e"} Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.160773 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c84072a647ce9ea6fc3707c368242dc10550a8968ee0dec59e31e772745a489e" Mar 20 13:24:01 crc kubenswrapper[4895]: I0320 13:24:01.160834 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:03 crc kubenswrapper[4895]: I0320 13:24:03.035113 4895 patch_prober.go:28] interesting pod/controller-manager-8676c49c64-p42fk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 20 13:24:03 crc kubenswrapper[4895]: I0320 13:24:03.035520 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 20 13:24:05 crc kubenswrapper[4895]: I0320 13:24:05.044116 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:24:05 crc kubenswrapper[4895]: I0320 13:24:05.532467 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mcm6r" Mar 20 13:24:07 crc kubenswrapper[4895]: E0320 13:24:07.142021 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:07 crc kubenswrapper[4895]: E0320 13:24:07.143582 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:07 crc kubenswrapper[4895]: E0320 13:24:07.144921 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:07 crc kubenswrapper[4895]: E0320 13:24:07.144953 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" Mar 20 13:24:08 crc kubenswrapper[4895]: I0320 13:24:08.232781 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:24:08 crc kubenswrapper[4895]: I0320 13:24:08.235552 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:24:09 crc kubenswrapper[4895]: I0320 13:24:09.911161 4895 ???:1] "http: TLS handshake error from 192.168.126.11:41332: no serving certificate available for the kubelet" Mar 20 13:24:11 crc kubenswrapper[4895]: I0320 13:24:11.268239 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.268220299 podStartE2EDuration="3.268220299s" podCreationTimestamp="2026-03-20 13:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:11.250780921 +0000 UTC m=+150.760499887" watchObservedRunningTime="2026-03-20 13:24:11.268220299 +0000 UTC m=+150.777939265" Mar 20 13:24:11 crc kubenswrapper[4895]: I0320 13:24:11.269132 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=3.269126825 podStartE2EDuration="3.269126825s" podCreationTimestamp="2026-03-20 13:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:11.267504851 +0000 UTC m=+150.777223817" watchObservedRunningTime="2026-03-20 13:24:11.269126825 +0000 UTC m=+150.778845791" Mar 20 13:24:12 crc kubenswrapper[4895]: I0320 13:24:12.031946 4895 patch_prober.go:28] interesting pod/route-controller-manager-5b59ddbb88-29t8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:12 crc kubenswrapper[4895]: I0320 13:24:12.032028 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:14 crc 
kubenswrapper[4895]: I0320 13:24:14.035803 4895 patch_prober.go:28] interesting pod/controller-manager-8676c49c64-p42fk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: i/o timeout" start-of-body= Mar 20 13:24:14 crc kubenswrapper[4895]: I0320 13:24:14.036118 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: i/o timeout" Mar 20 13:24:15 crc kubenswrapper[4895]: I0320 13:24:15.250627 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-gm5sr_11c4b94b-775c-473d-9c77-6597504fb4c8/kube-multus-additional-cni-plugins/0.log" Mar 20 13:24:15 crc kubenswrapper[4895]: I0320 13:24:15.250669 4895 generic.go:334] "Generic (PLEG): container finished" podID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" exitCode=137 Mar 20 13:24:15 crc kubenswrapper[4895]: I0320 13:24:15.250697 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" event={"ID":"11c4b94b-775c-473d-9c77-6597504fb4c8","Type":"ContainerDied","Data":"5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de"} Mar 20 13:24:15 crc kubenswrapper[4895]: E0320 13:24:15.526445 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:24:15 crc kubenswrapper[4895]: E0320 13:24:15.526640 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6cw9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tdcr2_openshift-marketplace(c9cbc624-2052-45bd-9d34-9cb03e70343c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:15 crc kubenswrapper[4895]: E0320 13:24:15.527817 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tdcr2" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" Mar 20 13:24:16 crc kubenswrapper[4895]: I0320 13:24:16.811941 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hxxm" Mar 20 13:24:17 crc kubenswrapper[4895]: E0320 13:24:17.140557 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:17 crc kubenswrapper[4895]: E0320 13:24:17.141230 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:17 crc kubenswrapper[4895]: E0320 13:24:17.141751 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:17 crc kubenswrapper[4895]: E0320 13:24:17.141845 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.050744 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:24:18 crc kubenswrapper[4895]: E0320 13:24:18.051048 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade8a205-5302-4452-9c51-f3a9053f0c03" containerName="pruner" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.051064 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade8a205-5302-4452-9c51-f3a9053f0c03" containerName="pruner" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.051219 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade8a205-5302-4452-9c51-f3a9053f0c03" containerName="pruner" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.051689 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.055260 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.055300 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.057853 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.146516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.146586 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.248194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.248367 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.248433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.282893 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:18 crc kubenswrapper[4895]: I0320 13:24:18.399107 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:20 crc kubenswrapper[4895]: I0320 13:24:20.241623 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:20 crc kubenswrapper[4895]: E0320 13:24:20.468347 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tdcr2" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.030742 4895 patch_prober.go:28] interesting pod/route-controller-manager-5b59ddbb88-29t8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.030812 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.216301 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.216567 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvrcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gn8m7_openshift-marketplace(f7d9f9c9-84fa-40b3-95fe-dd2f821c1262): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.217929 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gn8m7" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.246300 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.247952 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.261501 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.305007 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.305058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.305278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.405839 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.405876 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.405921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.405968 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.406029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.426948 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access\") 
pod \"installer-9-crc\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: I0320 13:24:22.582623 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.587045 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.587164 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p84sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,Windows
Options:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kd8jz_openshift-marketplace(5766f81f-890e-44ae-bef1-fe0335b631a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:22 crc kubenswrapper[4895]: E0320 13:24:22.588551 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kd8jz" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.540505 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gn8m7" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.540650 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kd8jz" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.560207 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.560370 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9h6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ltcwm_openshift-marketplace(50352717-2200-417f-b1ff-7e9adbe0cbf8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 
13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.561605 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ltcwm" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.613649 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.613806 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ct47f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2qwtb_openshift-marketplace(a71969a9-97c9-46c4-9e1c-051f3c86ae91): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.614961 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2qwtb" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" Mar 20 13:24:23 crc 
kubenswrapper[4895]: I0320 13:24:23.617249 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.621617 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.645560 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"] Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.646132 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.646147 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.646171 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.646181 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.646302 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" containerName="route-controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.646320 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.647210 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.653866 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"] Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.659253 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.660722 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghwhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsO
ptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qpm8f_openshift-marketplace(4e70e99c-ccbe-4290-ad2e-20f42e5bde4b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:23 crc kubenswrapper[4895]: E0320 13:24:23.662164 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qpm8f" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.721850 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca\") pod \"931af0a9-a596-4b43-b788-107bb6d266ce\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.721912 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config\") pod \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.721941 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles\") pod \"931af0a9-a596-4b43-b788-107bb6d266ce\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " Mar 20 13:24:23 crc 
kubenswrapper[4895]: I0320 13:24:23.721980 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca\") pod \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722025 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcd5w\" (UniqueName: \"kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w\") pod \"931af0a9-a596-4b43-b788-107bb6d266ce\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722054 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config\") pod \"931af0a9-a596-4b43-b788-107bb6d266ce\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722085 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert\") pod \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722108 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdtzt\" (UniqueName: \"kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt\") pod \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\" (UID: \"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722130 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert\") 
pod \"931af0a9-a596-4b43-b788-107bb6d266ce\" (UID: \"931af0a9-a596-4b43-b788-107bb6d266ce\") " Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722478 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722513 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7xk\" (UniqueName: \"kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722559 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722579 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "931af0a9-a596-4b43-b788-107bb6d266ce" (UID: "931af0a9-a596-4b43-b788-107bb6d266ce"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722646 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.722698 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.723059 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config" (OuterVolumeSpecName: "config") pod "931af0a9-a596-4b43-b788-107bb6d266ce" (UID: "931af0a9-a596-4b43-b788-107bb6d266ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.724284 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" (UID: "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.724275 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config" (OuterVolumeSpecName: "config") pod "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" (UID: "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.724332 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "931af0a9-a596-4b43-b788-107bb6d266ce" (UID: "931af0a9-a596-4b43-b788-107bb6d266ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.727565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "931af0a9-a596-4b43-b788-107bb6d266ce" (UID: "931af0a9-a596-4b43-b788-107bb6d266ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.727627 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" (UID: "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.727676 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt" (OuterVolumeSpecName: "kube-api-access-wdtzt") pod "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" (UID: "0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff"). InnerVolumeSpecName "kube-api-access-wdtzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.734851 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w" (OuterVolumeSpecName: "kube-api-access-vcd5w") pod "931af0a9-a596-4b43-b788-107bb6d266ce" (UID: "931af0a9-a596-4b43-b788-107bb6d266ce"). InnerVolumeSpecName "kube-api-access-vcd5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824125 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824221 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7xk\" (UniqueName: \"kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824355 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcd5w\" (UniqueName: \"kubernetes.io/projected/931af0a9-a596-4b43-b788-107bb6d266ce-kube-api-access-vcd5w\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824371 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824382 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824410 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdtzt\" (UniqueName: \"kubernetes.io/projected/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-kube-api-access-wdtzt\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824420 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931af0a9-a596-4b43-b788-107bb6d266ce-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824432 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824442 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/931af0a9-a596-4b43-b788-107bb6d266ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.824452 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.825111 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.825636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.829609 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.842251 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7xk\" (UniqueName: \"kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk\") pod \"route-controller-manager-c6947d9b-r2p5w\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " 
pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:23 crc kubenswrapper[4895]: I0320 13:24:23.972446 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.035809 4895 patch_prober.go:28] interesting pod/controller-manager-8676c49c64-p42fk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.035875 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.440991 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" event={"ID":"0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff","Type":"ContainerDied","Data":"6e7723e3189b95b393782039e8aa3c1936d68b7f761f1f6a937afd820c78286d"} Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.441049 4895 scope.go:117] "RemoveContainer" containerID="b2bb5df726ff567d2d891091fbd958775333b8f2fa054d7c482d19de6d1200da" Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.441057 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x" Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.442581 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" event={"ID":"931af0a9-a596-4b43-b788-107bb6d266ce","Type":"ContainerDied","Data":"cd991533fb4b9d0dcfb99663a400fccb72ebab2b8003aa9fe8c1ea6668a329d7"} Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.442658 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8676c49c64-p42fk" Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.510129 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.516753 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b59ddbb88-29t8x"] Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.519779 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"] Mar 20 13:24:24 crc kubenswrapper[4895]: I0320 13:24:24.522724 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8676c49c64-p42fk"] Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.222672 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff" path="/var/lib/kubelet/pods/0aa5fe02-05bc-4cd6-bf59-6d595a5ca5ff/volumes" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.224851 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931af0a9-a596-4b43-b788-107bb6d266ce" path="/var/lib/kubelet/pods/931af0a9-a596-4b43-b788-107bb6d266ce/volumes" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 
13:24:25.953698 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"] Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.954727 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.956461 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.956919 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.958351 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.960383 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.960606 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.962594 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.965355 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"] Mar 20 13:24:25 crc kubenswrapper[4895]: I0320 13:24:25.966458 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.052447 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.052500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdmtv\" (UniqueName: \"kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.052580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.052659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.052770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " 
pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.153800 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.153870 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.153892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdmtv\" (UniqueName: \"kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.153913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.154092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.154906 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.155097 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.157733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.160259 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.170961 4895 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-vdmtv\" (UniqueName: \"kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv\") pod \"controller-manager-5879558d4d-sksws\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") " pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:26 crc kubenswrapper[4895]: I0320 13:24:26.278809 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.139861 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.140553 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.140758 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.140793 4895 prober.go:104] "Probe errored" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.206151 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2qwtb" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.206178 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ltcwm" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.206675 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qpm8f" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.240064 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.240193 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgqht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5g6zh_openshift-marketplace(23478f5b-63b0-4a43-a716-9d22fad71c2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.242355 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5g6zh" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.245923 4895 scope.go:117] "RemoveContainer" containerID="2b39e932f3dd0fab27b13cb44927f68962935e581261b327e1a6e8d81d5d3daa" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.261179 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.261334 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzpdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6z86w_openshift-marketplace(485a7267-c39b-4b1e-95b1-075e868421ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.262897 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6z86w" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" Mar 20 13:24:27 crc 
kubenswrapper[4895]: I0320 13:24:27.319240 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-gm5sr_11c4b94b-775c-473d-9c77-6597504fb4c8/kube-multus-additional-cni-plugins/0.log" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.319352 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.369118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir\") pod \"11c4b94b-775c-473d-9c77-6597504fb4c8\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.370193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist\") pod \"11c4b94b-775c-473d-9c77-6597504fb4c8\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.370382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmrj6\" (UniqueName: \"kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6\") pod \"11c4b94b-775c-473d-9c77-6597504fb4c8\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.370449 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready\") pod \"11c4b94b-775c-473d-9c77-6597504fb4c8\" (UID: \"11c4b94b-775c-473d-9c77-6597504fb4c8\") " Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.371521 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready" (OuterVolumeSpecName: "ready") pod "11c4b94b-775c-473d-9c77-6597504fb4c8" (UID: "11c4b94b-775c-473d-9c77-6597504fb4c8"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.372048 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "11c4b94b-775c-473d-9c77-6597504fb4c8" (UID: "11c4b94b-775c-473d-9c77-6597504fb4c8"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.372110 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "11c4b94b-775c-473d-9c77-6597504fb4c8" (UID: "11c4b94b-775c-473d-9c77-6597504fb4c8"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.382830 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6" (OuterVolumeSpecName: "kube-api-access-kmrj6") pod "11c4b94b-775c-473d-9c77-6597504fb4c8" (UID: "11c4b94b-775c-473d-9c77-6597504fb4c8"). InnerVolumeSpecName "kube-api-access-kmrj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.464239 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-gm5sr_11c4b94b-775c-473d-9c77-6597504fb4c8/kube-multus-additional-cni-plugins/0.log" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.464564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" event={"ID":"11c4b94b-775c-473d-9c77-6597504fb4c8","Type":"ContainerDied","Data":"c28376bf917ac2521c98b3a5ccb5aea71c4169582ff8c7e87ff64ea03fecd5b3"} Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.464700 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-gm5sr" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.464710 4895 scope.go:117] "RemoveContainer" containerID="5a9fe9056f5fa83294e423c940963b96b6a78d640601ffd49be2f2c362ceb9de" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.466408 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5g6zh" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" Mar 20 13:24:27 crc kubenswrapper[4895]: E0320 13:24:27.466899 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6z86w" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.471813 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmrj6\" (UniqueName: 
\"kubernetes.io/projected/11c4b94b-775c-473d-9c77-6597504fb4c8-kube-api-access-kmrj6\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.471842 4895 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/11c4b94b-775c-473d-9c77-6597504fb4c8-ready\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.471854 4895 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11c4b94b-775c-473d-9c77-6597504fb4c8-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.471867 4895 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/11c4b94b-775c-473d-9c77-6597504fb4c8-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.507881 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-gm5sr"] Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.511777 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-gm5sr"] Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.679197 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.683108 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-dxd7s"] Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.772826 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:24:27 crc kubenswrapper[4895]: I0320 13:24:27.776947 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"] Mar 20 13:24:27 crc 
kubenswrapper[4895]: I0320 13:24:27.782476 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"] Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.471846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" event={"ID":"961b2d9b-3350-4f85-98af-412ea452ae83","Type":"ContainerStarted","Data":"cebd6ea83def41d499aac4f21a87260d4a79b0bc725e1a0038c9211b997012ca"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.474240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" event={"ID":"7e6a78db-29cd-4be1-bf2d-dba3047f5f41","Type":"ContainerStarted","Data":"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.474309 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" event={"ID":"7e6a78db-29cd-4be1-bf2d-dba3047f5f41","Type":"ContainerStarted","Data":"3aa13e6a695f648c560b354e9a0d6fdcff4b9023bb4699ae73b049ff98aa260c"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.474613 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.476626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a4c1431f-0be0-47f1-b86f-7a41d5015305","Type":"ContainerStarted","Data":"7049025af6f5e79b7ea415036ed543f2828dfe538b6af607b13db0a4608d6741"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.476673 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"a4c1431f-0be0-47f1-b86f-7a41d5015305","Type":"ContainerStarted","Data":"c9d26a79e7693da8d798cca94d5e65094a8b98a57c2581035405c14f2a8f1106"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.479565 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c678c41f-b1ad-4644-95ad-ad253fab8223","Type":"ContainerStarted","Data":"769f9d9ba60b813b0ca85896ea4574025fd23bee781de6e0930a2f5077768ba4"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.479600 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c678c41f-b1ad-4644-95ad-ad253fab8223","Type":"ContainerStarted","Data":"703e32296c581886a5893f4fb24525ec7ae5d07851fc7097739b555bbb82d602"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.481150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" event={"ID":"2e015edd-162e-4f4c-9491-e8eace1c4ce7","Type":"ContainerStarted","Data":"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.481193 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" event={"ID":"2e015edd-162e-4f4c-9491-e8eace1c4ce7","Type":"ContainerStarted","Data":"0c60066a033fe8a71fe0bd8322e14c04646c54deb0d722ce956cd81b39aecc6f"} Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.481651 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.484227 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.499190 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" podStartSLOduration=10.499171702 podStartE2EDuration="10.499171702s" podCreationTimestamp="2026-03-20 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:28.49524646 +0000 UTC m=+168.004965456" watchObservedRunningTime="2026-03-20 13:24:28.499171702 +0000 UTC m=+168.008890668" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.516068 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" podStartSLOduration=10.516048817 podStartE2EDuration="10.516048817s" podCreationTimestamp="2026-03-20 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:28.514544909 +0000 UTC m=+168.024263875" watchObservedRunningTime="2026-03-20 13:24:28.516048817 +0000 UTC m=+168.025767793" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.552997 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.552978163 podStartE2EDuration="10.552978163s" podCreationTimestamp="2026-03-20 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:28.551006536 +0000 UTC m=+168.060725512" watchObservedRunningTime="2026-03-20 13:24:28.552978163 +0000 UTC m=+168.062697129" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.599113 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.599091774 podStartE2EDuration="6.599091774s" podCreationTimestamp="2026-03-20 
13:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:28.597661219 +0000 UTC m=+168.107380185" watchObservedRunningTime="2026-03-20 13:24:28.599091774 +0000 UTC m=+168.108810750" Mar 20 13:24:28 crc kubenswrapper[4895]: I0320 13:24:28.867707 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:29 crc kubenswrapper[4895]: I0320 13:24:29.224954 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" path="/var/lib/kubelet/pods/11c4b94b-775c-473d-9c77-6597504fb4c8/volumes" Mar 20 13:24:29 crc kubenswrapper[4895]: I0320 13:24:29.491406 4895 generic.go:334] "Generic (PLEG): container finished" podID="c678c41f-b1ad-4644-95ad-ad253fab8223" containerID="769f9d9ba60b813b0ca85896ea4574025fd23bee781de6e0930a2f5077768ba4" exitCode=0 Mar 20 13:24:29 crc kubenswrapper[4895]: I0320 13:24:29.491471 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c678c41f-b1ad-4644-95ad-ad253fab8223","Type":"ContainerDied","Data":"769f9d9ba60b813b0ca85896ea4574025fd23bee781de6e0930a2f5077768ba4"} Mar 20 13:24:33 crc kubenswrapper[4895]: I0320 13:24:33.970295 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.066204 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access\") pod \"c678c41f-b1ad-4644-95ad-ad253fab8223\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.066301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir\") pod \"c678c41f-b1ad-4644-95ad-ad253fab8223\" (UID: \"c678c41f-b1ad-4644-95ad-ad253fab8223\") " Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.066762 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c678c41f-b1ad-4644-95ad-ad253fab8223" (UID: "c678c41f-b1ad-4644-95ad-ad253fab8223"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.085968 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c678c41f-b1ad-4644-95ad-ad253fab8223" (UID: "c678c41f-b1ad-4644-95ad-ad253fab8223"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.168301 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c678c41f-b1ad-4644-95ad-ad253fab8223-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.168361 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c678c41f-b1ad-4644-95ad-ad253fab8223-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.520554 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c678c41f-b1ad-4644-95ad-ad253fab8223","Type":"ContainerDied","Data":"703e32296c581886a5893f4fb24525ec7ae5d07851fc7097739b555bbb82d602"} Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.520867 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703e32296c581886a5893f4fb24525ec7ae5d07851fc7097739b555bbb82d602" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.520595 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.522308 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerStarted","Data":"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c"} Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.928751 4895 csr.go:261] certificate signing request csr-fk6m7 is approved, waiting to be issued Mar 20 13:24:34 crc kubenswrapper[4895]: I0320 13:24:34.934803 4895 csr.go:257] certificate signing request csr-fk6m7 is issued Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.529205 4895 generic.go:334] "Generic (PLEG): container finished" podID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerID="dd8520c47b8c4c9cff3097da1826b0456cd2edb41b296f47a581cd2ca5ea23f2" exitCode=0 Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.529280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerDied","Data":"dd8520c47b8c4c9cff3097da1826b0456cd2edb41b296f47a581cd2ca5ea23f2"} Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.531737 4895 generic.go:334] "Generic (PLEG): container finished" podID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerID="cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c" exitCode=0 Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.531782 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerDied","Data":"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c"} Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.535806 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="961b2d9b-3350-4f85-98af-412ea452ae83" containerID="0d513000bd226a761adeeec48f2a23b06935898f09b024db232fc20f5127f7b1" exitCode=0 Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.535838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" event={"ID":"961b2d9b-3350-4f85-98af-412ea452ae83","Type":"ContainerDied","Data":"0d513000bd226a761adeeec48f2a23b06935898f09b024db232fc20f5127f7b1"} Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.936542 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 10:14:45.26133511 +0000 UTC Mar 20 13:24:35 crc kubenswrapper[4895]: I0320 13:24:35.936577 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6596h50m9.324761246s for next certificate rotation Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.543375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerStarted","Data":"3acd8b90d15c3c486efcb03e9a62a588ec38afd45d8b9d4987d51ebd4a28c43e"} Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.545290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerStarted","Data":"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e"} Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.559764 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kd8jz" podStartSLOduration=3.405357996 podStartE2EDuration="54.559746745s" podCreationTimestamp="2026-03-20 13:23:42 +0000 UTC" firstStartedPulling="2026-03-20 13:23:44.831642798 +0000 UTC m=+124.341361754" lastFinishedPulling="2026-03-20 13:24:35.986031537 +0000 UTC 
m=+175.495750503" observedRunningTime="2026-03-20 13:24:36.557351063 +0000 UTC m=+176.067070029" watchObservedRunningTime="2026-03-20 13:24:36.559746745 +0000 UTC m=+176.069465711" Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.575958 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tdcr2" podStartSLOduration=3.500664112 podStartE2EDuration="54.575940225s" podCreationTimestamp="2026-03-20 13:23:42 +0000 UTC" firstStartedPulling="2026-03-20 13:23:44.842609996 +0000 UTC m=+124.352328962" lastFinishedPulling="2026-03-20 13:24:35.917886109 +0000 UTC m=+175.427605075" observedRunningTime="2026-03-20 13:24:36.573130746 +0000 UTC m=+176.082849712" watchObservedRunningTime="2026-03-20 13:24:36.575940225 +0000 UTC m=+176.085659191" Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.859811 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.917941 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxt9\" (UniqueName: \"kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9\") pod \"961b2d9b-3350-4f85-98af-412ea452ae83\" (UID: \"961b2d9b-3350-4f85-98af-412ea452ae83\") " Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.923637 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9" (OuterVolumeSpecName: "kube-api-access-9bxt9") pod "961b2d9b-3350-4f85-98af-412ea452ae83" (UID: "961b2d9b-3350-4f85-98af-412ea452ae83"). InnerVolumeSpecName "kube-api-access-9bxt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.936716 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-20 22:55:30.002691105 +0000 UTC Mar 20 13:24:36 crc kubenswrapper[4895]: I0320 13:24:36.936755 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7353h30m53.065938662s for next certificate rotation Mar 20 13:24:37 crc kubenswrapper[4895]: I0320 13:24:37.018880 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bxt9\" (UniqueName: \"kubernetes.io/projected/961b2d9b-3350-4f85-98af-412ea452ae83-kube-api-access-9bxt9\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:37 crc kubenswrapper[4895]: I0320 13:24:37.552719 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" event={"ID":"961b2d9b-3350-4f85-98af-412ea452ae83","Type":"ContainerDied","Data":"cebd6ea83def41d499aac4f21a87260d4a79b0bc725e1a0038c9211b997012ca"} Mar 20 13:24:37 crc kubenswrapper[4895]: I0320 13:24:37.552779 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cebd6ea83def41d499aac4f21a87260d4a79b0bc725e1a0038c9211b997012ca" Mar 20 13:24:37 crc kubenswrapper[4895]: I0320 13:24:37.552785 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-dxd7s" Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.567125 4895 generic.go:334] "Generic (PLEG): container finished" podID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerID="aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24" exitCode=0 Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.567195 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerDied","Data":"aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24"} Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.807359 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"] Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.807613 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" podUID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" containerName="controller-manager" containerID="cri-o://8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d" gracePeriod=30 Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.842583 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"] Mar 20 13:24:38 crc kubenswrapper[4895]: I0320 13:24:38.842787 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" podUID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" containerName="route-controller-manager" containerID="cri-o://9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977" gracePeriod=30 Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.346502 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.445058 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g7xk\" (UniqueName: \"kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk\") pod \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.445216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert\") pod \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.445291 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config\") pod \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.445347 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca\") pod \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\" (UID: \"2e015edd-162e-4f4c-9491-e8eace1c4ce7\") " Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.445882 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e015edd-162e-4f4c-9491-e8eace1c4ce7" (UID: "2e015edd-162e-4f4c-9491-e8eace1c4ce7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.446002 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config" (OuterVolumeSpecName: "config") pod "2e015edd-162e-4f4c-9491-e8eace1c4ce7" (UID: "2e015edd-162e-4f4c-9491-e8eace1c4ce7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.446211 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.446230 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e015edd-162e-4f4c-9491-e8eace1c4ce7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.450039 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e015edd-162e-4f4c-9491-e8eace1c4ce7" (UID: "2e015edd-162e-4f4c-9491-e8eace1c4ce7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.450112 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk" (OuterVolumeSpecName: "kube-api-access-7g7xk") pod "2e015edd-162e-4f4c-9491-e8eace1c4ce7" (UID: "2e015edd-162e-4f4c-9491-e8eace1c4ce7"). InnerVolumeSpecName "kube-api-access-7g7xk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.474729 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.546924 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca\") pod \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") "
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.546981 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config\") pod \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") "
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.547038 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles\") pod \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") "
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.547083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdmtv\" (UniqueName: \"kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv\") pod \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") "
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.547121 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert\") pod \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\" (UID: \"7e6a78db-29cd-4be1-bf2d-dba3047f5f41\") "
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.547443 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e015edd-162e-4f4c-9491-e8eace1c4ce7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.547462 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g7xk\" (UniqueName: \"kubernetes.io/projected/2e015edd-162e-4f4c-9491-e8eace1c4ce7-kube-api-access-7g7xk\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.548135 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e6a78db-29cd-4be1-bf2d-dba3047f5f41" (UID: "7e6a78db-29cd-4be1-bf2d-dba3047f5f41"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.548168 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config" (OuterVolumeSpecName: "config") pod "7e6a78db-29cd-4be1-bf2d-dba3047f5f41" (UID: "7e6a78db-29cd-4be1-bf2d-dba3047f5f41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.548467 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e6a78db-29cd-4be1-bf2d-dba3047f5f41" (UID: "7e6a78db-29cd-4be1-bf2d-dba3047f5f41"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.550197 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv" (OuterVolumeSpecName: "kube-api-access-vdmtv") pod "7e6a78db-29cd-4be1-bf2d-dba3047f5f41" (UID: "7e6a78db-29cd-4be1-bf2d-dba3047f5f41"). InnerVolumeSpecName "kube-api-access-vdmtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.550437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e6a78db-29cd-4be1-bf2d-dba3047f5f41" (UID: "7e6a78db-29cd-4be1-bf2d-dba3047f5f41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.574972 4895 generic.go:334] "Generic (PLEG): container finished" podID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" containerID="9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977" exitCode=0
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.575030 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" event={"ID":"2e015edd-162e-4f4c-9491-e8eace1c4ce7","Type":"ContainerDied","Data":"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"}
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.575102 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w" event={"ID":"2e015edd-162e-4f4c-9491-e8eace1c4ce7","Type":"ContainerDied","Data":"0c60066a033fe8a71fe0bd8322e14c04646c54deb0d722ce956cd81b39aecc6f"}
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.575131 4895 scope.go:117] "RemoveContainer" containerID="9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.575042 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.577656 4895 generic.go:334] "Generic (PLEG): container finished" podID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" containerID="8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d" exitCode=0
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.577734 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.578317 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" event={"ID":"7e6a78db-29cd-4be1-bf2d-dba3047f5f41","Type":"ContainerDied","Data":"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"}
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.578367 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5879558d4d-sksws" event={"ID":"7e6a78db-29cd-4be1-bf2d-dba3047f5f41","Type":"ContainerDied","Data":"3aa13e6a695f648c560b354e9a0d6fdcff4b9023bb4699ae73b049ff98aa260c"}
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.580316 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerStarted","Data":"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"}
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.595152 4895 scope.go:117] "RemoveContainer" containerID="9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.596233 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977\": container with ID starting with 9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977 not found: ID does not exist" containerID="9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.596369 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977"} err="failed to get container status \"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977\": rpc error: code = NotFound desc = could not find container \"9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977\": container with ID starting with 9c77106e4ee22101bcee234d2954a8b4712f0c64f660412a8137fc52a1b5e977 not found: ID does not exist"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.596485 4895 scope.go:117] "RemoveContainer" containerID="8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.601452 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gn8m7" podStartSLOduration=2.492553848 podStartE2EDuration="55.601441345s" podCreationTimestamp="2026-03-20 13:23:44 +0000 UTC" firstStartedPulling="2026-03-20 13:23:45.908849533 +0000 UTC m=+125.418568489" lastFinishedPulling="2026-03-20 13:24:39.01773702 +0000 UTC m=+178.527455986" observedRunningTime="2026-03-20 13:24:39.600594423 +0000 UTC m=+179.110313399" watchObservedRunningTime="2026-03-20 13:24:39.601441345 +0000 UTC m=+179.111160321"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.614705 4895 scope.go:117] "RemoveContainer" containerID="8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.616528 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d\": container with ID starting with 8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d not found: ID does not exist" containerID="8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.616570 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d"} err="failed to get container status \"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d\": rpc error: code = NotFound desc = could not find container \"8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d\": container with ID starting with 8337d777c7d595290a7f26360d3720d518660c32e14c00268ef45763dc43507d not found: ID does not exist"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.623423 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.631036 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c6947d9b-r2p5w"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.640165 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.645417 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5879558d4d-sksws"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.649103 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.649132 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.649146 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.649159 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdmtv\" (UniqueName: \"kubernetes.io/projected/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-kube-api-access-vdmtv\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.649170 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e6a78db-29cd-4be1-bf2d-dba3047f5f41-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.966556 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.966988 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" containerName="route-controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967069 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" containerName="route-controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.967162 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="961b2d9b-3350-4f85-98af-412ea452ae83" containerName="oc"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967229 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="961b2d9b-3350-4f85-98af-412ea452ae83" containerName="oc"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.967317 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967382 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.967489 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c678c41f-b1ad-4644-95ad-ad253fab8223" containerName="pruner"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967555 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c678c41f-b1ad-4644-95ad-ad253fab8223" containerName="pruner"
Mar 20 13:24:39 crc kubenswrapper[4895]: E0320 13:24:39.967621 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" containerName="controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967684 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" containerName="controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967869 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c4b94b-775c-473d-9c77-6597504fb4c8" containerName="kube-multus-additional-cni-plugins"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.967946 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" containerName="controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.968027 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" containerName="route-controller-manager"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.968116 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c678c41f-b1ad-4644-95ad-ad253fab8223" containerName="pruner"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.968201 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="961b2d9b-3350-4f85-98af-412ea452ae83" containerName="oc"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.968734 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.969978 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.970593 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.977112 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.980413 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.980888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.982457 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.984905 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.985125 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.985434 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.985588 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.985825 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.985906 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.986161 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.986265 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.986368 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.988459 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:24:39 crc kubenswrapper[4895]: I0320 13:24:39.991081 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054062 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054109 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlljw\" (UniqueName: \"kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054167 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054185 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054198 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054221 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt6gk\" (UniqueName: \"kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.054277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.155984 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156034 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156080 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt6gk\" (UniqueName: \"kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156142 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156167 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlljw\" (UniqueName: \"kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.156200 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.157326 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.158349 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.159459 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.159988 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.160953 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.164556 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.164869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.176890 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlljw\" (UniqueName: \"kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw\") pod \"route-controller-manager-bc8dfcc6b-kxl27\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") " pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.178564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt6gk\" (UniqueName: \"kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk\") pod \"controller-manager-67fc78d778-4j5nv\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") " pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.295774 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.315131 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.590535 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerStarted","Data":"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407"}
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.837790 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:40 crc kubenswrapper[4895]: W0320 13:24:40.839564 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278c6500_c7e2_427a_a477_81179f04717d.slice/crio-6c7276e271afb450729a512afc47fce1b13f88e3cd858a2607c3c12ae5b29fb1 WatchSource:0}: Error finding container 6c7276e271afb450729a512afc47fce1b13f88e3cd858a2607c3c12ae5b29fb1: Status 404 returned error can't find the container with id 6c7276e271afb450729a512afc47fce1b13f88e3cd858a2607c3c12ae5b29fb1
Mar 20 13:24:40 crc kubenswrapper[4895]: I0320 13:24:40.850919 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:40 crc kubenswrapper[4895]: W0320 13:24:40.859704 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ae1250_46a0_4a18_b90b_c686cf9f796d.slice/crio-9c94cf28a4dc4d5fbbf2fcba49224832fdf8ab232bff0db4b736ac753d96c595 WatchSource:0}: Error finding container 9c94cf28a4dc4d5fbbf2fcba49224832fdf8ab232bff0db4b736ac753d96c595: Status 404 returned error can't find the container with id 9c94cf28a4dc4d5fbbf2fcba49224832fdf8ab232bff0db4b736ac753d96c595
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.224148 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e015edd-162e-4f4c-9491-e8eace1c4ce7" path="/var/lib/kubelet/pods/2e015edd-162e-4f4c-9491-e8eace1c4ce7/volumes"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.226162 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6a78db-29cd-4be1-bf2d-dba3047f5f41" path="/var/lib/kubelet/pods/7e6a78db-29cd-4be1-bf2d-dba3047f5f41/volumes"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.603449 4895 generic.go:334] "Generic (PLEG): container finished" podID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerID="6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5" exitCode=0
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.603562 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerDied","Data":"6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.605688 4895 generic.go:334] "Generic (PLEG): container finished" podID="485a7267-c39b-4b1e-95b1-075e868421ed" containerID="635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407" exitCode=0
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.605779 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerDied","Data":"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.607563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" event={"ID":"278c6500-c7e2-427a-a477-81179f04717d","Type":"ContainerStarted","Data":"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.607623 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" event={"ID":"278c6500-c7e2-427a-a477-81179f04717d","Type":"ContainerStarted","Data":"6c7276e271afb450729a512afc47fce1b13f88e3cd858a2607c3c12ae5b29fb1"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.607708 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.609243 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" event={"ID":"77ae1250-46a0-4a18-b90b-c686cf9f796d","Type":"ContainerStarted","Data":"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.609285 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" event={"ID":"77ae1250-46a0-4a18-b90b-c686cf9f796d","Type":"ContainerStarted","Data":"9c94cf28a4dc4d5fbbf2fcba49224832fdf8ab232bff0db4b736ac753d96c595"}
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.610131 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.616686 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.626128 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.643812 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" podStartSLOduration=3.643794836 podStartE2EDuration="3.643794836s" podCreationTimestamp="2026-03-20 13:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:41.643037507 +0000 UTC m=+181.152756473" watchObservedRunningTime="2026-03-20 13:24:41.643794836 +0000 UTC m=+181.153513802"
Mar 20 13:24:41 crc kubenswrapper[4895]: I0320 13:24:41.699168 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" podStartSLOduration=3.69914866 podStartE2EDuration="3.69914866s" podCreationTimestamp="2026-03-20 13:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:41.698985456 +0000 UTC m=+181.208704422" watchObservedRunningTime="2026-03-20 13:24:41.69914866 +0000 UTC m=+181.208867626"
Mar 20
13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.618612 4895 generic.go:334] "Generic (PLEG): container finished" podID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerID="e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b" exitCode=0 Mar 20 13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.619004 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerDied","Data":"e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b"} Mar 20 13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.621771 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerStarted","Data":"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f"} Mar 20 13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.624630 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerStarted","Data":"fdf76b0110900f43a20951fb7ae272aaef7451684c4507b5331612ba2c95f6e6"} Mar 20 13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.768717 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:24:42 crc kubenswrapper[4895]: I0320 13:24:42.768770 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.222467 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.222818 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:43 crc 
kubenswrapper[4895]: I0320 13:24:43.361518 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.363477 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.632190 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerStarted","Data":"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191"} Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.633812 4895 generic.go:334] "Generic (PLEG): container finished" podID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerID="fdf76b0110900f43a20951fb7ae272aaef7451684c4507b5331612ba2c95f6e6" exitCode=0 Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.634497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerDied","Data":"fdf76b0110900f43a20951fb7ae272aaef7451684c4507b5331612ba2c95f6e6"} Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.709894 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ltcwm" podStartSLOduration=3.43998463 podStartE2EDuration="59.709871667s" podCreationTimestamp="2026-03-20 13:23:44 +0000 UTC" firstStartedPulling="2026-03-20 13:23:46.935334968 +0000 UTC m=+126.445053934" lastFinishedPulling="2026-03-20 13:24:43.205222005 +0000 UTC m=+182.714940971" observedRunningTime="2026-03-20 13:24:43.649949616 +0000 UTC m=+183.159668582" watchObservedRunningTime="2026-03-20 13:24:43.709871667 +0000 UTC m=+183.219590653" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.713829 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tdcr2" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.726977 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6z86w" podStartSLOduration=3.322194905 podStartE2EDuration="58.726960576s" podCreationTimestamp="2026-03-20 13:23:45 +0000 UTC" firstStartedPulling="2026-03-20 13:23:46.911136303 +0000 UTC m=+126.420855259" lastFinishedPulling="2026-03-20 13:24:42.315901964 +0000 UTC m=+181.825620930" observedRunningTime="2026-03-20 13:24:43.726005381 +0000 UTC m=+183.235724367" watchObservedRunningTime="2026-03-20 13:24:43.726960576 +0000 UTC m=+183.236679542" Mar 20 13:24:43 crc kubenswrapper[4895]: I0320 13:24:43.733145 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:44 crc kubenswrapper[4895]: I0320 13:24:44.915742 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gn8m7" Mar 20 13:24:44 crc kubenswrapper[4895]: I0320 13:24:44.916012 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gn8m7" Mar 20 13:24:44 crc kubenswrapper[4895]: I0320 13:24:44.958442 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gn8m7" Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.340577 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.341207 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.640006 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-kd8jz"] Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.644997 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kd8jz" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="registry-server" containerID="cri-o://3acd8b90d15c3c486efcb03e9a62a588ec38afd45d8b9d4987d51ebd4a28c43e" gracePeriod=2 Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.681988 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gn8m7" Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.924521 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:24:45 crc kubenswrapper[4895]: I0320 13:24:45.924563 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:24:46 crc kubenswrapper[4895]: I0320 13:24:46.378195 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ltcwm" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="registry-server" probeResult="failure" output=< Mar 20 13:24:46 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:24:46 crc kubenswrapper[4895]: > Mar 20 13:24:46 crc kubenswrapper[4895]: I0320 13:24:46.966334 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z86w" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="registry-server" probeResult="failure" output=< Mar 20 13:24:46 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:24:46 crc kubenswrapper[4895]: > Mar 20 13:24:47 crc kubenswrapper[4895]: I0320 13:24:47.658113 4895 generic.go:334] "Generic (PLEG): container finished" podID="5766f81f-890e-44ae-bef1-fe0335b631a1" 
containerID="3acd8b90d15c3c486efcb03e9a62a588ec38afd45d8b9d4987d51ebd4a28c43e" exitCode=0 Mar 20 13:24:47 crc kubenswrapper[4895]: I0320 13:24:47.658165 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerDied","Data":"3acd8b90d15c3c486efcb03e9a62a588ec38afd45d8b9d4987d51ebd4a28c43e"} Mar 20 13:24:47 crc kubenswrapper[4895]: I0320 13:24:47.661506 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerStarted","Data":"62c5b718a4a75f74c07414fec26fcd3038aae43c11a723774889806506c8cbbc"} Mar 20 13:24:47 crc kubenswrapper[4895]: I0320 13:24:47.663907 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerStarted","Data":"d38ced92836b8c43f7157bf6fb56e6855c09477fa3c5c16bdaa65eeada770679"} Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.536760 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.591966 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p84sx\" (UniqueName: \"kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx\") pod \"5766f81f-890e-44ae-bef1-fe0335b631a1\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.593167 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities\") pod \"5766f81f-890e-44ae-bef1-fe0335b631a1\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.593333 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content\") pod \"5766f81f-890e-44ae-bef1-fe0335b631a1\" (UID: \"5766f81f-890e-44ae-bef1-fe0335b631a1\") " Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.594224 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities" (OuterVolumeSpecName: "utilities") pod "5766f81f-890e-44ae-bef1-fe0335b631a1" (UID: "5766f81f-890e-44ae-bef1-fe0335b631a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.597691 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx" (OuterVolumeSpecName: "kube-api-access-p84sx") pod "5766f81f-890e-44ae-bef1-fe0335b631a1" (UID: "5766f81f-890e-44ae-bef1-fe0335b631a1"). InnerVolumeSpecName "kube-api-access-p84sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.656823 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5766f81f-890e-44ae-bef1-fe0335b631a1" (UID: "5766f81f-890e-44ae-bef1-fe0335b631a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.670589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd8jz" event={"ID":"5766f81f-890e-44ae-bef1-fe0335b631a1","Type":"ContainerDied","Data":"dbb1738481161f0882fef38e1bad6c14a79d9503baf2d0f304342f6e75379640"} Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.670663 4895 scope.go:117] "RemoveContainer" containerID="3acd8b90d15c3c486efcb03e9a62a588ec38afd45d8b9d4987d51ebd4a28c43e" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.670823 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd8jz" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.677095 4895 generic.go:334] "Generic (PLEG): container finished" podID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerID="62c5b718a4a75f74c07414fec26fcd3038aae43c11a723774889806506c8cbbc" exitCode=0 Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.677183 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerDied","Data":"62c5b718a4a75f74c07414fec26fcd3038aae43c11a723774889806506c8cbbc"} Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.694955 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.695005 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p84sx\" (UniqueName: \"kubernetes.io/projected/5766f81f-890e-44ae-bef1-fe0335b631a1-kube-api-access-p84sx\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.695026 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5766f81f-890e-44ae-bef1-fe0335b631a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.702655 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5g6zh" podStartSLOduration=6.4376174729999995 podStartE2EDuration="1m3.702632453s" podCreationTimestamp="2026-03-20 13:23:45 +0000 UTC" firstStartedPulling="2026-03-20 13:23:48.995769791 +0000 UTC m=+128.505488757" lastFinishedPulling="2026-03-20 13:24:46.260784761 +0000 UTC m=+185.770503737" observedRunningTime="2026-03-20 13:24:48.698167489 +0000 
UTC m=+188.207886455" watchObservedRunningTime="2026-03-20 13:24:48.702632453 +0000 UTC m=+188.212351459" Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.735987 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kd8jz"] Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.739636 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kd8jz"] Mar 20 13:24:48 crc kubenswrapper[4895]: I0320 13:24:48.749123 4895 scope.go:117] "RemoveContainer" containerID="dd8520c47b8c4c9cff3097da1826b0456cd2edb41b296f47a581cd2ca5ea23f2" Mar 20 13:24:49 crc kubenswrapper[4895]: I0320 13:24:49.158572 4895 scope.go:117] "RemoveContainer" containerID="a341884ee1ddc62e5105651f348685cfbb7e576c10ae68ecc99cb3db814cad4b" Mar 20 13:24:49 crc kubenswrapper[4895]: I0320 13:24:49.232754 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" path="/var/lib/kubelet/pods/5766f81f-890e-44ae-bef1-fe0335b631a1/volumes" Mar 20 13:24:50 crc kubenswrapper[4895]: I0320 13:24:50.693418 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerStarted","Data":"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"} Mar 20 13:24:50 crc kubenswrapper[4895]: I0320 13:24:50.695932 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerStarted","Data":"4de377c8452ffdd9d09283143ae5a2a875a8355076050728efeff023517a3823"} Mar 20 13:24:50 crc kubenswrapper[4895]: I0320 13:24:50.711058 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpm8f" podStartSLOduration=4.404278794 podStartE2EDuration="1m8.711030792s" podCreationTimestamp="2026-03-20 
13:23:42 +0000 UTC" firstStartedPulling="2026-03-20 13:23:44.851678892 +0000 UTC m=+124.361397858" lastFinishedPulling="2026-03-20 13:24:49.15843089 +0000 UTC m=+188.668149856" observedRunningTime="2026-03-20 13:24:50.70978176 +0000 UTC m=+190.219500736" watchObservedRunningTime="2026-03-20 13:24:50.711030792 +0000 UTC m=+190.220749798" Mar 20 13:24:50 crc kubenswrapper[4895]: I0320 13:24:50.734765 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2qwtb" podStartSLOduration=3.505745266 podStartE2EDuration="1m8.734738611s" podCreationTimestamp="2026-03-20 13:23:42 +0000 UTC" firstStartedPulling="2026-03-20 13:23:44.834252425 +0000 UTC m=+124.343971391" lastFinishedPulling="2026-03-20 13:24:50.06324577 +0000 UTC m=+189.572964736" observedRunningTime="2026-03-20 13:24:50.730571424 +0000 UTC m=+190.240290410" watchObservedRunningTime="2026-03-20 13:24:50.734738611 +0000 UTC m=+190.244457587" Mar 20 13:24:52 crc kubenswrapper[4895]: I0320 13:24:52.970046 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:24:52 crc kubenswrapper[4895]: I0320 13:24:52.970484 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:24:53 crc kubenswrapper[4895]: I0320 13:24:53.049001 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:24:53 crc kubenswrapper[4895]: I0320 13:24:53.404992 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:24:53 crc kubenswrapper[4895]: I0320 13:24:53.405673 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:24:53 crc kubenswrapper[4895]: I0320 13:24:53.475728 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:24:54 crc kubenswrapper[4895]: I0320 13:24:54.128574 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q8wls"] Mar 20 13:24:54 crc kubenswrapper[4895]: I0320 13:24:54.786066 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpm8f" Mar 20 13:24:55 crc kubenswrapper[4895]: I0320 13:24:55.410666 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:55 crc kubenswrapper[4895]: I0320 13:24:55.478963 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:55 crc kubenswrapper[4895]: I0320 13:24:55.962464 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:24:56 crc kubenswrapper[4895]: I0320 13:24:56.012072 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6z86w" Mar 20 13:24:56 crc kubenswrapper[4895]: I0320 13:24:56.395549 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:24:56 crc kubenswrapper[4895]: I0320 13:24:56.395608 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:24:56 crc kubenswrapper[4895]: I0320 13:24:56.460115 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:24:56 crc kubenswrapper[4895]: I0320 13:24:56.775483 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.041514 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"] Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.042222 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ltcwm" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="registry-server" containerID="cri-o://6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191" gracePeriod=2 Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.697686 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.727880 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9h6f\" (UniqueName: \"kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f\") pod \"50352717-2200-417f-b1ff-7e9adbe0cbf8\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.728017 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content\") pod \"50352717-2200-417f-b1ff-7e9adbe0cbf8\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.728050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities\") pod \"50352717-2200-417f-b1ff-7e9adbe0cbf8\" (UID: \"50352717-2200-417f-b1ff-7e9adbe0cbf8\") " Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.728778 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities" (OuterVolumeSpecName: "utilities") pod "50352717-2200-417f-b1ff-7e9adbe0cbf8" (UID: "50352717-2200-417f-b1ff-7e9adbe0cbf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.733756 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f" (OuterVolumeSpecName: "kube-api-access-j9h6f") pod "50352717-2200-417f-b1ff-7e9adbe0cbf8" (UID: "50352717-2200-417f-b1ff-7e9adbe0cbf8"). InnerVolumeSpecName "kube-api-access-j9h6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.741349 4895 generic.go:334] "Generic (PLEG): container finished" podID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerID="6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191" exitCode=0 Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.741407 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerDied","Data":"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191"} Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.741437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltcwm" event={"ID":"50352717-2200-417f-b1ff-7e9adbe0cbf8","Type":"ContainerDied","Data":"91b08e7fd4a16d7dfde34cb2bf6d5a940d441e902bdd591920d62f102f9e6ee8"} Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.741457 4895 scope.go:117] "RemoveContainer" containerID="6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.741579 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltcwm" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.753969 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50352717-2200-417f-b1ff-7e9adbe0cbf8" (UID: "50352717-2200-417f-b1ff-7e9adbe0cbf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.760968 4895 scope.go:117] "RemoveContainer" containerID="6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.778606 4895 scope.go:117] "RemoveContainer" containerID="858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801079 4895 scope.go:117] "RemoveContainer" containerID="6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191" Mar 20 13:24:58 crc kubenswrapper[4895]: E0320 13:24:58.801438 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191\": container with ID starting with 6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191 not found: ID does not exist" containerID="6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191" Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801471 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191"} err="failed to get container status \"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191\": rpc error: code = NotFound desc = could not find container \"6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191\": 
container with ID starting with 6e8d5513ba2dd2b01198f8ec42a2de3ba7ee24f5b3987f5b33b48e4c0045f191 not found: ID does not exist"
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801494 4895 scope.go:117] "RemoveContainer" containerID="6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5"
Mar 20 13:24:58 crc kubenswrapper[4895]: E0320 13:24:58.801719 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5\": container with ID starting with 6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5 not found: ID does not exist" containerID="6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5"
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801745 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5"} err="failed to get container status \"6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5\": rpc error: code = NotFound desc = could not find container \"6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5\": container with ID starting with 6354aed44aafbd2f569d07fe819096f13a4f3367e1e72335eabecd34719b09a5 not found: ID does not exist"
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801759 4895 scope.go:117] "RemoveContainer" containerID="858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67"
Mar 20 13:24:58 crc kubenswrapper[4895]: E0320 13:24:58.801956 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67\": container with ID starting with 858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67 not found: ID does not exist" containerID="858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67"
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.801977 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67"} err="failed to get container status \"858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67\": rpc error: code = NotFound desc = could not find container \"858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67\": container with ID starting with 858e4bbc03a2048e6302d5b6019bcbf4808c88535f290ba67051aeb757700f67 not found: ID does not exist"
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.829156 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.829186 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50352717-2200-417f-b1ff-7e9adbe0cbf8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.829197 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9h6f\" (UniqueName: \"kubernetes.io/projected/50352717-2200-417f-b1ff-7e9adbe0cbf8-kube-api-access-j9h6f\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.878831 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.879046 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" podUID="278c6500-c7e2-427a-a477-81179f04717d" containerName="controller-manager" containerID="cri-o://0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409" gracePeriod=30
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.992845 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:58 crc kubenswrapper[4895]: I0320 13:24:58.993041 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" podUID="77ae1250-46a0-4a18-b90b-c686cf9f796d" containerName="route-controller-manager" containerID="cri-o://66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c" gracePeriod=30
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.092713 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.097625 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltcwm"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.220681 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" path="/var/lib/kubelet/pods/50352717-2200-417f-b1ff-7e9adbe0cbf8/volumes"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.477049 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.481563 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles\") pod \"278c6500-c7e2-427a-a477-81179f04717d\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548330 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlljw\" (UniqueName: \"kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw\") pod \"77ae1250-46a0-4a18-b90b-c686cf9f796d\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548401 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert\") pod \"77ae1250-46a0-4a18-b90b-c686cf9f796d\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548436 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca\") pod \"278c6500-c7e2-427a-a477-81179f04717d\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548482 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config\") pod \"278c6500-c7e2-427a-a477-81179f04717d\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt6gk\" (UniqueName: \"kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk\") pod \"278c6500-c7e2-427a-a477-81179f04717d\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548523 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config\") pod \"77ae1250-46a0-4a18-b90b-c686cf9f796d\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca\") pod \"77ae1250-46a0-4a18-b90b-c686cf9f796d\" (UID: \"77ae1250-46a0-4a18-b90b-c686cf9f796d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.548572 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert\") pod \"278c6500-c7e2-427a-a477-81179f04717d\" (UID: \"278c6500-c7e2-427a-a477-81179f04717d\") "
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.550306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca" (OuterVolumeSpecName: "client-ca") pod "77ae1250-46a0-4a18-b90b-c686cf9f796d" (UID: "77ae1250-46a0-4a18-b90b-c686cf9f796d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.550366 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "278c6500-c7e2-427a-a477-81179f04717d" (UID: "278c6500-c7e2-427a-a477-81179f04717d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.550439 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca" (OuterVolumeSpecName: "client-ca") pod "278c6500-c7e2-427a-a477-81179f04717d" (UID: "278c6500-c7e2-427a-a477-81179f04717d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.550591 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config" (OuterVolumeSpecName: "config") pod "77ae1250-46a0-4a18-b90b-c686cf9f796d" (UID: "77ae1250-46a0-4a18-b90b-c686cf9f796d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.551036 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config" (OuterVolumeSpecName: "config") pod "278c6500-c7e2-427a-a477-81179f04717d" (UID: "278c6500-c7e2-427a-a477-81179f04717d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.554462 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77ae1250-46a0-4a18-b90b-c686cf9f796d" (UID: "77ae1250-46a0-4a18-b90b-c686cf9f796d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.555585 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk" (OuterVolumeSpecName: "kube-api-access-bt6gk") pod "278c6500-c7e2-427a-a477-81179f04717d" (UID: "278c6500-c7e2-427a-a477-81179f04717d"). InnerVolumeSpecName "kube-api-access-bt6gk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.561279 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "278c6500-c7e2-427a-a477-81179f04717d" (UID: "278c6500-c7e2-427a-a477-81179f04717d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.561544 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw" (OuterVolumeSpecName: "kube-api-access-rlljw") pod "77ae1250-46a0-4a18-b90b-c686cf9f796d" (UID: "77ae1250-46a0-4a18-b90b-c686cf9f796d"). InnerVolumeSpecName "kube-api-access-rlljw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650195 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650252 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278c6500-c7e2-427a-a477-81179f04717d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650263 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650275 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlljw\" (UniqueName: \"kubernetes.io/projected/77ae1250-46a0-4a18-b90b-c686cf9f796d-kube-api-access-rlljw\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650287 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ae1250-46a0-4a18-b90b-c686cf9f796d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650300 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650310 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278c6500-c7e2-427a-a477-81179f04717d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650320 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt6gk\" (UniqueName: \"kubernetes.io/projected/278c6500-c7e2-427a-a477-81179f04717d-kube-api-access-bt6gk\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.650331 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ae1250-46a0-4a18-b90b-c686cf9f796d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.747135 4895 generic.go:334] "Generic (PLEG): container finished" podID="278c6500-c7e2-427a-a477-81179f04717d" containerID="0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409" exitCode=0
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.747178 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" event={"ID":"278c6500-c7e2-427a-a477-81179f04717d","Type":"ContainerDied","Data":"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"}
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.747214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv" event={"ID":"278c6500-c7e2-427a-a477-81179f04717d","Type":"ContainerDied","Data":"6c7276e271afb450729a512afc47fce1b13f88e3cd858a2607c3c12ae5b29fb1"}
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.747211 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67fc78d778-4j5nv"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.747230 4895 scope.go:117] "RemoveContainer" containerID="0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.753603 4895 generic.go:334] "Generic (PLEG): container finished" podID="77ae1250-46a0-4a18-b90b-c686cf9f796d" containerID="66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c" exitCode=0
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.753645 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.753641 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" event={"ID":"77ae1250-46a0-4a18-b90b-c686cf9f796d","Type":"ContainerDied","Data":"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"}
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.753791 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27" event={"ID":"77ae1250-46a0-4a18-b90b-c686cf9f796d","Type":"ContainerDied","Data":"9c94cf28a4dc4d5fbbf2fcba49224832fdf8ab232bff0db4b736ac753d96c595"}
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.766867 4895 scope.go:117] "RemoveContainer" containerID="0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.767633 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409\": container with ID starting with 0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409 not found: ID does not exist" containerID="0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.767669 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409"} err="failed to get container status \"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409\": rpc error: code = NotFound desc = could not find container \"0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409\": container with ID starting with 0acbd71707b8e21b87bd3e8e8c16549b56de130d4e528c63fe5c64c3472c2409 not found: ID does not exist"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.767696 4895 scope.go:117] "RemoveContainer" containerID="66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.782120 4895 scope.go:117] "RemoveContainer" containerID="66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.782587 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c\": container with ID starting with 66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c not found: ID does not exist" containerID="66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.782631 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c"} err="failed to get container status \"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c\": rpc error: code = NotFound desc = could not find container \"66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c\": container with ID starting with 66d98532c22d8717d1f6027541c83d9360427d36d763efcba3ade044ce2ff16c not found: ID does not exist"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.785236 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.787921 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bc8dfcc6b-kxl27"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.793604 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.798258 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67fc78d778-4j5nv"]
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979529 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-559f966cf9-lxz8d"]
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979717 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ae1250-46a0-4a18-b90b-c686cf9f796d" containerName="route-controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979728 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ae1250-46a0-4a18-b90b-c686cf9f796d" containerName="route-controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979738 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c6500-c7e2-427a-a477-81179f04717d" containerName="controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979744 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c6500-c7e2-427a-a477-81179f04717d" containerName="controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979755 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979760 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979771 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="extract-utilities"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979777 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="extract-utilities"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979785 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="extract-content"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979791 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="extract-content"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979801 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="extract-content"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979807 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="extract-content"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979815 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="extract-utilities"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979822 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="extract-utilities"
Mar 20 13:24:59 crc kubenswrapper[4895]: E0320 13:24:59.979830 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979835 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979918 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5766f81f-890e-44ae-bef1-fe0335b631a1" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979929 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ae1250-46a0-4a18-b90b-c686cf9f796d" containerName="route-controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979938 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="278c6500-c7e2-427a-a477-81179f04717d" containerName="controller-manager"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.979944 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="50352717-2200-417f-b1ff-7e9adbe0cbf8" containerName="registry-server"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.980258 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.982385 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.982639 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.982782 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.982899 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.983060 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.983222 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.991363 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:24:59 crc kubenswrapper[4895]: I0320 13:24:59.995164 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559f966cf9-lxz8d"]
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.054454 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-client-ca\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.054500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-config\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.054527 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-proxy-ca-bundles\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.054554 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e84766-e65e-4507-8b8b-1f501c5dc909-serving-cert\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.054751 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrq2\" (UniqueName: \"kubernetes.io/projected/55e84766-e65e-4507-8b8b-1f501c5dc909-kube-api-access-jhrq2\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.156172 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrq2\" (UniqueName: \"kubernetes.io/projected/55e84766-e65e-4507-8b8b-1f501c5dc909-kube-api-access-jhrq2\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.156248 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-client-ca\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.156281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-config\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.156307 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-proxy-ca-bundles\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.156335 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e84766-e65e-4507-8b8b-1f501c5dc909-serving-cert\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.157433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-client-ca\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.157696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-config\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.158127 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55e84766-e65e-4507-8b8b-1f501c5dc909-proxy-ca-bundles\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.161141 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e84766-e65e-4507-8b8b-1f501c5dc909-serving-cert\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.178908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrq2\" (UniqueName: \"kubernetes.io/projected/55e84766-e65e-4507-8b8b-1f501c5dc909-kube-api-access-jhrq2\") pod \"controller-manager-559f966cf9-lxz8d\" (UID: \"55e84766-e65e-4507-8b8b-1f501c5dc909\") " pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.298634 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.441152 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g6zh"]
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.441421 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5g6zh" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="registry-server" containerID="cri-o://d38ced92836b8c43f7157bf6fb56e6855c09477fa3c5c16bdaa65eeada770679" gracePeriod=2
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.531263 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559f966cf9-lxz8d"]
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.765126 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d" event={"ID":"55e84766-e65e-4507-8b8b-1f501c5dc909","Type":"ContainerStarted","Data":"d6d24059b0eef13aa47fa8e467f26c6f9777c16cc01a1665a20f50e3f57d0312"}
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.769291 4895 generic.go:334] "Generic (PLEG): container finished" podID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerID="d38ced92836b8c43f7157bf6fb56e6855c09477fa3c5c16bdaa65eeada770679" exitCode=0
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.769356 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerDied","Data":"d38ced92836b8c43f7157bf6fb56e6855c09477fa3c5c16bdaa65eeada770679"}
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.830917 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5g6zh"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.885477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgqht\" (UniqueName: \"kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht\") pod \"23478f5b-63b0-4a43-a716-9d22fad71c2c\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") "
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.885577 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content\") pod \"23478f5b-63b0-4a43-a716-9d22fad71c2c\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") "
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.885599 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities\") pod \"23478f5b-63b0-4a43-a716-9d22fad71c2c\" (UID: \"23478f5b-63b0-4a43-a716-9d22fad71c2c\") "
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.886445 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities" (OuterVolumeSpecName: "utilities") pod "23478f5b-63b0-4a43-a716-9d22fad71c2c" (UID: "23478f5b-63b0-4a43-a716-9d22fad71c2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.886529 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.890765 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht" (OuterVolumeSpecName: "kube-api-access-mgqht") pod "23478f5b-63b0-4a43-a716-9d22fad71c2c" (UID: "23478f5b-63b0-4a43-a716-9d22fad71c2c"). InnerVolumeSpecName "kube-api-access-mgqht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.978454 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"]
Mar 20 13:25:00 crc kubenswrapper[4895]: E0320 13:25:00.978761 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="registry-server"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.978783 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="registry-server"
Mar 20 13:25:00 crc kubenswrapper[4895]: E0320 13:25:00.978804 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="extract-content"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.978813 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="extract-content"
Mar 20 13:25:00 crc kubenswrapper[4895]: E0320 13:25:00.978829 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="extract-utilities"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.978839 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="extract-utilities"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.978944 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" containerName="registry-server"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.979493 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.987287 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e21ca8-51b2-4312-b367-f37dca4b2008-serving-cert\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.987342 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-client-ca\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.987378 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-config\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"
Mar 20 13:25:00 crc kubenswrapper[4895]: I0320
13:25:00.987512 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsvc\" (UniqueName: \"kubernetes.io/projected/b7e21ca8-51b2-4312-b367-f37dca4b2008-kube-api-access-5rsvc\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.987582 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgqht\" (UniqueName: \"kubernetes.io/projected/23478f5b-63b0-4a43-a716-9d22fad71c2c-kube-api-access-mgqht\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.995464 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.995881 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.996034 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:25:00 crc kubenswrapper[4895]: I0320 13:25:00.996124 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.008194 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.008598 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.013175 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"] Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.018362 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23478f5b-63b0-4a43-a716-9d22fad71c2c" (UID: "23478f5b-63b0-4a43-a716-9d22fad71c2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.088084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e21ca8-51b2-4312-b367-f37dca4b2008-serving-cert\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.088373 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-client-ca\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.088548 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-config\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.088712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsvc\" 
(UniqueName: \"kubernetes.io/projected/b7e21ca8-51b2-4312-b367-f37dca4b2008-kube-api-access-5rsvc\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.088854 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23478f5b-63b0-4a43-a716-9d22fad71c2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.089372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-client-ca\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.089826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7e21ca8-51b2-4312-b367-f37dca4b2008-config\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.095372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e21ca8-51b2-4312-b367-f37dca4b2008-serving-cert\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.113108 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsvc\" (UniqueName: 
\"kubernetes.io/projected/b7e21ca8-51b2-4312-b367-f37dca4b2008-kube-api-access-5rsvc\") pod \"route-controller-manager-577cd9d876-qn5bz\" (UID: \"b7e21ca8-51b2-4312-b367-f37dca4b2008\") " pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.235380 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278c6500-c7e2-427a-a477-81179f04717d" path="/var/lib/kubelet/pods/278c6500-c7e2-427a-a477-81179f04717d/volumes" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.236427 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ae1250-46a0-4a18-b90b-c686cf9f796d" path="/var/lib/kubelet/pods/77ae1250-46a0-4a18-b90b-c686cf9f796d/volumes" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.297273 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.483605 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz"] Mar 20 13:25:01 crc kubenswrapper[4895]: W0320 13:25:01.488859 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e21ca8_51b2_4312_b367_f37dca4b2008.slice/crio-90afaa23507400152aeefa6d87458e05e3959c7ad92481975e5581d2a1aad124 WatchSource:0}: Error finding container 90afaa23507400152aeefa6d87458e05e3959c7ad92481975e5581d2a1aad124: Status 404 returned error can't find the container with id 90afaa23507400152aeefa6d87458e05e3959c7ad92481975e5581d2a1aad124 Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.777828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" 
event={"ID":"b7e21ca8-51b2-4312-b367-f37dca4b2008","Type":"ContainerStarted","Data":"2cec74b399c4d63a1ee9dc41502b48738b6319d949b9a1c908a06b130c5e86f6"} Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.777882 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" event={"ID":"b7e21ca8-51b2-4312-b367-f37dca4b2008","Type":"ContainerStarted","Data":"90afaa23507400152aeefa6d87458e05e3959c7ad92481975e5581d2a1aad124"} Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.778167 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.780670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d" event={"ID":"55e84766-e65e-4507-8b8b-1f501c5dc909","Type":"ContainerStarted","Data":"066cf3aa733e075a51d0b4519a830465d03bad06ce3c16fed5a2ecd7bfca1a65"} Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.780817 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.783737 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5g6zh" event={"ID":"23478f5b-63b0-4a43-a716-9d22fad71c2c","Type":"ContainerDied","Data":"a94c2c2dfad2b953c8ffd52a627ffbe838d285f944ce9e464702460fe4e6747f"} Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.783779 4895 scope.go:117] "RemoveContainer" containerID="d38ced92836b8c43f7157bf6fb56e6855c09477fa3c5c16bdaa65eeada770679" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.783828 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5g6zh" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.787621 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.796049 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" podStartSLOduration=2.796036158 podStartE2EDuration="2.796036158s" podCreationTimestamp="2026-03-20 13:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:01.793284478 +0000 UTC m=+201.303003444" watchObservedRunningTime="2026-03-20 13:25:01.796036158 +0000 UTC m=+201.305755124" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.799706 4895 scope.go:117] "RemoveContainer" containerID="fdf76b0110900f43a20951fb7ae272aaef7451684c4507b5331612ba2c95f6e6" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.810241 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-559f966cf9-lxz8d" podStartSLOduration=3.810222083 podStartE2EDuration="3.810222083s" podCreationTimestamp="2026-03-20 13:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:01.808880328 +0000 UTC m=+201.318599294" watchObservedRunningTime="2026-03-20 13:25:01.810222083 +0000 UTC m=+201.319941049" Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.823540 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5g6zh"] Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.825997 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-5g6zh"] Mar 20 13:25:01 crc kubenswrapper[4895]: I0320 13:25:01.829750 4895 scope.go:117] "RemoveContainer" containerID="02d37e2a658a62e476ec57f75a75370f33365181af42940b24d1cf8d5947e1ff" Mar 20 13:25:02 crc kubenswrapper[4895]: I0320 13:25:02.315016 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-577cd9d876-qn5bz" Mar 20 13:25:03 crc kubenswrapper[4895]: I0320 13:25:03.218670 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23478f5b-63b0-4a43-a716-9d22fad71c2c" path="/var/lib/kubelet/pods/23478f5b-63b0-4a43-a716-9d22fad71c2c/volumes" Mar 20 13:25:03 crc kubenswrapper[4895]: I0320 13:25:03.463884 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.440261 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qwtb"] Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.440818 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2qwtb" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="registry-server" containerID="cri-o://4de377c8452ffdd9d09283143ae5a2a875a8355076050728efeff023517a3823" gracePeriod=2 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.807585 4895 generic.go:334] "Generic (PLEG): container finished" podID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerID="4de377c8452ffdd9d09283143ae5a2a875a8355076050728efeff023517a3823" exitCode=0 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.807620 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" 
event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerDied","Data":"4de377c8452ffdd9d09283143ae5a2a875a8355076050728efeff023517a3823"} Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.807646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qwtb" event={"ID":"a71969a9-97c9-46c4-9e1c-051f3c86ae91","Type":"ContainerDied","Data":"3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7"} Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.807658 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e1a71e55052782f8bdb3ebf67030b55374aff5d068ae0dd87b6ba35c93cbcc7" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835204 4895 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.835422 4895 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835490 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946" gracePeriod=15 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835512 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57" gracePeriod=15 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835554 
4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a" gracePeriod=15 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835580 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183" gracePeriod=15 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.835580 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94" gracePeriod=15 Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.837447 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838015 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838033 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838046 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838053 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838064 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838072 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838081 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838088 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838102 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838110 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838124 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838131 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838143 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838148 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838158 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838164 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838275 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838286 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838295 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838302 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838310 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838319 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838326 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838333 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838465 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838475 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: E0320 13:25:05.838483 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838490 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.838569 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.839527 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.839930 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843870 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843914 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843935 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843981 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.843996 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.844031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.844056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.844324 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.846633 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.898426 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.944573 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct47f\" (UniqueName: \"kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f\") pod \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.944683 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content\") pod \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.944825 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities\") pod \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\" (UID: \"a71969a9-97c9-46c4-9e1c-051f3c86ae91\") " Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945087 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945120 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945182 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945211 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945230 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945323 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945371 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945412 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 
20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945449 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945465 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.945477 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.948308 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities" (OuterVolumeSpecName: "utilities") pod "a71969a9-97c9-46c4-9e1c-051f3c86ae91" (UID: "a71969a9-97c9-46c4-9e1c-051f3c86ae91"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.952108 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f" (OuterVolumeSpecName: "kube-api-access-ct47f") pod "a71969a9-97c9-46c4-9e1c-051f3c86ae91" (UID: "a71969a9-97c9-46c4-9e1c-051f3c86ae91"). InnerVolumeSpecName "kube-api-access-ct47f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:05 crc kubenswrapper[4895]: I0320 13:25:05.997856 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a71969a9-97c9-46c4-9e1c-051f3c86ae91" (UID: "a71969a9-97c9-46c4-9e1c-051f3c86ae91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.046988 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.047279 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct47f\" (UniqueName: \"kubernetes.io/projected/a71969a9-97c9-46c4-9e1c-051f3c86ae91-kube-api-access-ct47f\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.047382 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a71969a9-97c9-46c4-9e1c-051f3c86ae91-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.194501 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:06 crc kubenswrapper[4895]: W0320 13:25:06.219475 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-56a45def096a809d626d6c64d2792c1400a9c3a8baebf35d372f94d2837ed6f0 WatchSource:0}: Error finding container 56a45def096a809d626d6c64d2792c1400a9c3a8baebf35d372f94d2837ed6f0: Status 404 returned error can't find the container with id 56a45def096a809d626d6c64d2792c1400a9c3a8baebf35d372f94d2837ed6f0 Mar 20 13:25:06 crc kubenswrapper[4895]: E0320 13:25:06.223455 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8f855f3b0627 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:25:06.221680167 +0000 UTC m=+205.731399133,LastTimestamp:2026-03-20 13:25:06.221680167 +0000 UTC m=+205.731399133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.817985 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.820457 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.821368 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57" exitCode=0 Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.821413 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a" exitCode=0 Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.821424 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183" exitCode=0 Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.821434 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94" exitCode=2 Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.821475 4895 scope.go:117] "RemoveContainer" containerID="a5166a4d1f0019a334339c9ea7f1a8ae2e72579202b9a640929e4aacf74eb86b" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.823939 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260"} Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.823977 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"56a45def096a809d626d6c64d2792c1400a9c3a8baebf35d372f94d2837ed6f0"} Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.824900 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.826207 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4c1431f-0be0-47f1-b86f-7a41d5015305" containerID="7049025af6f5e79b7ea415036ed543f2828dfe538b6af607b13db0a4608d6741" exitCode=0 Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.826304 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2qwtb" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.826303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a4c1431f-0be0-47f1-b86f-7a41d5015305","Type":"ContainerDied","Data":"7049025af6f5e79b7ea415036ed543f2828dfe538b6af607b13db0a4608d6741"} Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.826958 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.827548 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.827848 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.828042 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection 
refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.828229 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.866169 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.866979 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:06 crc kubenswrapper[4895]: I0320 13:25:06.867617 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:07 crc kubenswrapper[4895]: I0320 13:25:07.836252 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.296556 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.297658 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.298195 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.298559 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.298848 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.299091 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.299946 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.300277 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.300619 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.300819 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.301049 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.477852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access\") pod \"a4c1431f-0be0-47f1-b86f-7a41d5015305\" (UID: 
\"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.477901 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.477977 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478024 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir\") pod \"a4c1431f-0be0-47f1-b86f-7a41d5015305\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478071 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock\") pod \"a4c1431f-0be0-47f1-b86f-7a41d5015305\" (UID: \"a4c1431f-0be0-47f1-b86f-7a41d5015305\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478146 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod 
"f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478273 4895 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478309 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478339 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a4c1431f-0be0-47f1-b86f-7a41d5015305" (UID: "a4c1431f-0be0-47f1-b86f-7a41d5015305"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478354 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock" (OuterVolumeSpecName: "var-lock") pod "a4c1431f-0be0-47f1-b86f-7a41d5015305" (UID: "a4c1431f-0be0-47f1-b86f-7a41d5015305"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.478367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.482588 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a4c1431f-0be0-47f1-b86f-7a41d5015305" (UID: "a4c1431f-0be0-47f1-b86f-7a41d5015305"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.579797 4895 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.579870 4895 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.579885 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c1431f-0be0-47f1-b86f-7a41d5015305-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.579902 4895 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc 
kubenswrapper[4895]: I0320 13:25:08.579917 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c1431f-0be0-47f1-b86f-7a41d5015305-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.851472 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.854459 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946" exitCode=0 Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.854602 4895 scope.go:117] "RemoveContainer" containerID="17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.854648 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.859386 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a4c1431f-0be0-47f1-b86f-7a41d5015305","Type":"ContainerDied","Data":"c9d26a79e7693da8d798cca94d5e65094a8b98a57c2581035405c14f2a8f1106"} Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.859591 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9d26a79e7693da8d798cca94d5e65094a8b98a57c2581035405c14f2a8f1106" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.859617 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.887648 4895 scope.go:117] "RemoveContainer" containerID="8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.895426 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.895916 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.896336 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.896744 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.897649 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.898094 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.898579 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.898942 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.912660 4895 scope.go:117] "RemoveContainer" containerID="27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.929121 4895 scope.go:117] "RemoveContainer" containerID="af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.953675 4895 scope.go:117] "RemoveContainer" containerID="a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946" Mar 20 13:25:08 crc kubenswrapper[4895]: 
I0320 13:25:08.975214 4895 scope.go:117] "RemoveContainer" containerID="d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.995902 4895 scope.go:117] "RemoveContainer" containerID="17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.996809 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57\": container with ID starting with 17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57 not found: ID does not exist" containerID="17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.996948 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57"} err="failed to get container status \"17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57\": rpc error: code = NotFound desc = could not find container \"17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57\": container with ID starting with 17619decf442c8e0c48a32c927eb6a3a67cb40ba99870c77037350f40588fa57 not found: ID does not exist" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.997115 4895 scope.go:117] "RemoveContainer" containerID="8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.997664 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a\": container with ID starting with 8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a not found: ID does not exist" 
containerID="8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.997798 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a"} err="failed to get container status \"8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a\": rpc error: code = NotFound desc = could not find container \"8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a\": container with ID starting with 8a77afa4256cc0ceb798b2f03495c0efd8557e593bb726505c1bc1ce7169ca6a not found: ID does not exist" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.997826 4895 scope.go:117] "RemoveContainer" containerID="27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.998181 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183\": container with ID starting with 27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183 not found: ID does not exist" containerID="27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.998212 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183"} err="failed to get container status \"27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183\": rpc error: code = NotFound desc = could not find container \"27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183\": container with ID starting with 27eddfe53065bcc41251953e53b56e1fcee1a3e2dc8e0ff9a9e3da1446c08183 not found: ID does not exist" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.998232 4895 scope.go:117] 
"RemoveContainer" containerID="af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.998660 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94\": container with ID starting with af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94 not found: ID does not exist" containerID="af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.998685 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94"} err="failed to get container status \"af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94\": rpc error: code = NotFound desc = could not find container \"af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94\": container with ID starting with af957f143a8343aa68e6d5f47295732edc30ab9ddf7ce40126b4ada5fe35dc94 not found: ID does not exist" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.998701 4895 scope.go:117] "RemoveContainer" containerID="a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.998992 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946\": container with ID starting with a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946 not found: ID does not exist" containerID="a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.999011 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946"} err="failed to get container status \"a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946\": rpc error: code = NotFound desc = could not find container \"a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946\": container with ID starting with a74748beaced9e421972af5314193b5824a52565b1797490b7bc277b36ae7946 not found: ID does not exist" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.999025 4895 scope.go:117] "RemoveContainer" containerID="d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5" Mar 20 13:25:08 crc kubenswrapper[4895]: E0320 13:25:08.999450 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5\": container with ID starting with d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5 not found: ID does not exist" containerID="d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5" Mar 20 13:25:08 crc kubenswrapper[4895]: I0320 13:25:08.999543 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5"} err="failed to get container status \"d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5\": rpc error: code = NotFound desc = could not find container \"d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5\": container with ID starting with d1f8f57b387f1901a3cad09b789d5db264c3961b0bb57f168fc4f154140547e5 not found: ID does not exist" Mar 20 13:25:09 crc kubenswrapper[4895]: I0320 13:25:09.220460 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:25:10 crc kubenswrapper[4895]: E0320 
13:25:10.932734 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8f855f3b0627 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:25:06.221680167 +0000 UTC m=+205.731399133,LastTimestamp:2026-03-20 13:25:06.221680167 +0000 UTC m=+205.731399133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:25:11 crc kubenswrapper[4895]: I0320 13:25:11.215279 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:11 crc kubenswrapper[4895]: I0320 13:25:11.215676 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:11 crc 
kubenswrapper[4895]: I0320 13:25:11.216015 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.563094 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.564516 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.565078 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.565606 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.566080 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:13 crc kubenswrapper[4895]: I0320 13:25:13.566146 4895 controller.go:115] "failed to update lease using latest lease, fallback to ensure 
lease" err="failed 5 attempts to update lease" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.566670 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms" Mar 20 13:25:13 crc kubenswrapper[4895]: E0320 13:25:13.768558 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Mar 20 13:25:14 crc kubenswrapper[4895]: E0320 13:25:14.169636 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms" Mar 20 13:25:14 crc kubenswrapper[4895]: E0320 13:25:14.971377 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s" Mar 20 13:25:16 crc kubenswrapper[4895]: E0320 13:25:16.573778 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="3.2s" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.210921 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.212769 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.213364 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.213956 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.242501 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.242597 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:17 crc kubenswrapper[4895]: E0320 13:25:17.243224 4895 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:17 crc kubenswrapper[4895]: I0320 13:25:17.244130 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:17 crc kubenswrapper[4895]: W0320 13:25:17.278196 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-0cf1088c4aea8d011c3bf7a10056dfceb59bdd320c808f666ea86b6432d7c3fe WatchSource:0}: Error finding container 0cf1088c4aea8d011c3bf7a10056dfceb59bdd320c808f666ea86b6432d7c3fe: Status 404 returned error can't find the container with id 0cf1088c4aea8d011c3bf7a10056dfceb59bdd320c808f666ea86b6432d7c3fe Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.126076 4895 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c096210f4c84d4e0483f82a163d27006cd888073e9de3becedc7c7664520f4b0" exitCode=0 Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.126189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c096210f4c84d4e0483f82a163d27006cd888073e9de3becedc7c7664520f4b0"} Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.127140 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0cf1088c4aea8d011c3bf7a10056dfceb59bdd320c808f666ea86b6432d7c3fe"} Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.127607 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.127636 4895 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:18 crc kubenswrapper[4895]: E0320 13:25:18.128027 4895 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.128009 4895 status_manager.go:851] "Failed to get status for pod" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.130140 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:18 crc kubenswrapper[4895]: I0320 13:25:18.130823 4895 status_manager.go:851] "Failed to get status for pod" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" pod="openshift-marketplace/community-operators-2qwtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2qwtb\": dial tcp 38.102.83.82:6443: connect: connection refused" Mar 20 13:25:19 crc kubenswrapper[4895]: I0320 13:25:19.136019 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0b8d78e03c87eacf88ad6653c5a865b4eb4bc7b6d5d53952337a0be5b4fa2bf1"} Mar 20 13:25:19 crc kubenswrapper[4895]: 
I0320 13:25:19.136301 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe1e5df6097a43e021c28056d0c4d0e4198d8e81fcccfc9e871d622c8e7f8352"} Mar 20 13:25:19 crc kubenswrapper[4895]: I0320 13:25:19.151868 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" podUID="64a35300-9f9f-44c7-a1ff-d818032e001a" containerName="oauth-openshift" containerID="cri-o://bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7" gracePeriod=15 Mar 20 13:25:19 crc kubenswrapper[4895]: I0320 13:25:19.695910 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.144270 4895 generic.go:334] "Generic (PLEG): container finished" podID="64a35300-9f9f-44c7-a1ff-d818032e001a" containerID="bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7" exitCode=0 Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.144345 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.144349 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" event={"ID":"64a35300-9f9f-44c7-a1ff-d818032e001a","Type":"ContainerDied","Data":"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7"} Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.144454 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q8wls" event={"ID":"64a35300-9f9f-44c7-a1ff-d818032e001a","Type":"ContainerDied","Data":"fe5425d1d0ad848e1de834fdf844787c4268cbf06fd63aac0aeb5170cd345cfc"} Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.144501 4895 scope.go:117] "RemoveContainer" containerID="bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.148300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a003f1bb275673f046b782979870d8826b7f78947f38cc656b5a0a51a9780a6"} Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.148351 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"326a0f85483c7f74683ca0d232a54746d7e1f1d6aad74d7bba9f9e76841d8e69"} Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.148373 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dd6a29d370f03ed2d2d7eceb00c78a2938c4d81dce6e74c7fe86cce3d728b370"} Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.149307 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.149385 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.149442 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.170769 4895 scope.go:117] "RemoveContainer" containerID="bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7" Mar 20 13:25:20 crc kubenswrapper[4895]: E0320 13:25:20.171173 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7\": container with ID starting with bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7 not found: ID does not exist" containerID="bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7" Mar 20 13:25:20 crc kubenswrapper[4895]: I0320 13:25:20.171232 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7"} err="failed to get container status \"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7\": rpc error: code = NotFound desc = could not find container \"bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7\": container with ID starting with bc21f4c2b08416071fd004d4f78ba2759e49365cefce65e8b7937bc9fc9ca5e7 not found: ID does not exist" Mar 20 13:25:21 crc kubenswrapper[4895]: I0320 13:25:21.159737 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:25:21 crc 
kubenswrapper[4895]: I0320 13:25:21.160968 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:25:21 crc kubenswrapper[4895]: I0320 13:25:21.161057 4895 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ae27d8e0648506efab87e9191454765762f4bf5387968c9227cd1717d72ad478" exitCode=1 Mar 20 13:25:21 crc kubenswrapper[4895]: I0320 13:25:21.161112 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ae27d8e0648506efab87e9191454765762f4bf5387968c9227cd1717d72ad478"} Mar 20 13:25:21 crc kubenswrapper[4895]: I0320 13:25:21.161860 4895 scope.go:117] "RemoveContainer" containerID="ae27d8e0648506efab87e9191454765762f4bf5387968c9227cd1717d72ad478" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.175351 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.176598 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.176680 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f1c619cb496dd91a8b7adaf82ac0055874a6401b7b5bc8a4d6733c504b78b04c"} Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.244435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.244489 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.251329 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.297332 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.297450 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:25:22 crc kubenswrapper[4895]: I0320 13:25:22.526640 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.810872 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.813004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.813774 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.813916 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814242 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert\") pod 
\"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814575 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5rtl\" (UniqueName: \"kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814709 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814824 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.812940 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.813905 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.814887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.815167 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.815264 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.815446 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.815567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.815691 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session\") pod \"64a35300-9f9f-44c7-a1ff-d818032e001a\" (UID: \"64a35300-9f9f-44c7-a1ff-d818032e001a\") " Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.816410 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.816528 4895 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.816617 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.816696 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.817034 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.858722 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.858781 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl" (OuterVolumeSpecName: "kube-api-access-m5rtl") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "kube-api-access-m5rtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.859841 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.860497 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.862871 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.863677 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.866842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.868985 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.871503 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "64a35300-9f9f-44c7-a1ff-d818032e001a" (UID: "64a35300-9f9f-44c7-a1ff-d818032e001a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917809 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917850 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917868 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5rtl\" (UniqueName: \"kubernetes.io/projected/64a35300-9f9f-44c7-a1ff-d818032e001a-kube-api-access-m5rtl\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917886 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917904 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917922 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917942 4895 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917958 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917974 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:24 crc kubenswrapper[4895]: I0320 13:25:24.917990 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64a35300-9f9f-44c7-a1ff-d818032e001a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:25 crc kubenswrapper[4895]: I0320 13:25:25.172164 4895 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:26 crc kubenswrapper[4895]: I0320 13:25:26.201321 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:26 crc kubenswrapper[4895]: I0320 13:25:26.201690 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:26 crc kubenswrapper[4895]: I0320 13:25:26.209378 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:26 crc kubenswrapper[4895]: I0320 13:25:26.212713 4895 status_manager.go:861] "Pod was deleted and then recreated, 
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="56063317-ef26-43cb-80a5-8c71f7e49089" Mar 20 13:25:27 crc kubenswrapper[4895]: I0320 13:25:27.208455 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:27 crc kubenswrapper[4895]: I0320 13:25:27.208512 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:28 crc kubenswrapper[4895]: I0320 13:25:28.320655 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:25:28 crc kubenswrapper[4895]: I0320 13:25:28.327576 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:25:31 crc kubenswrapper[4895]: I0320 13:25:31.234957 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="56063317-ef26-43cb-80a5-8c71f7e49089" Mar 20 13:25:32 crc kubenswrapper[4895]: I0320 13:25:32.532385 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:25:35 crc kubenswrapper[4895]: I0320 13:25:35.270839 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:25:35 crc kubenswrapper[4895]: I0320 13:25:35.961464 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:25:35 crc kubenswrapper[4895]: I0320 13:25:35.985249 4895 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:25:36 crc kubenswrapper[4895]: I0320 13:25:36.115362 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:25:36 crc kubenswrapper[4895]: I0320 13:25:36.912536 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.295552 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.402685 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.422815 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.455691 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.475971 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:25:37 crc kubenswrapper[4895]: I0320 13:25:37.944618 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.103544 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.387675 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.690942 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.758089 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.808305 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.828415 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:25:38 crc kubenswrapper[4895]: I0320 13:25:38.926303 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.259917 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.348833 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.357589 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.373877 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.420938 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.533900 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:25:39 crc kubenswrapper[4895]: 
I0320 13:25:39.572695 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.610717 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.726107 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.856270 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:25:39 crc kubenswrapper[4895]: I0320 13:25:39.866370 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.053538 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.082860 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.170316 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.174649 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.371194 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.425233 
4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.542055 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.699489 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.722561 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.812868 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.854532 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.916538 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:25:40 crc kubenswrapper[4895]: I0320 13:25:40.946190 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.225484 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.332726 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.346545 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 
13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.434375 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.460709 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.696670 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.763621 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.897536 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.901561 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:25:41 crc kubenswrapper[4895]: I0320 13:25:41.995886 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.008787 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.236087 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.291615 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.333158 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.366654 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.368938 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.478550 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.489542 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.513657 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.540084 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.569598 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.585766 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.687469 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.706508 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 
13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.708864 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.801321 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.824129 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:25:42 crc kubenswrapper[4895]: I0320 13:25:42.919744 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.014709 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.017781 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.017906 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.029152 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.146130 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.271571 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.303077 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.343616 4895 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.422285 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.452787 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.493827 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.533137 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.577909 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.580121 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.594957 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.845508 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.845629 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.865380 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.895729 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.913079 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.933091 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.943467 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.948729 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:25:43 crc kubenswrapper[4895]: I0320 13:25:43.986830 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.135876 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.353872 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.409669 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.451811 4895 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.483386 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.506158 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.544630 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.586041 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.665792 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.736399 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.814756 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.819883 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.983719 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:25:44 crc kubenswrapper[4895]: I0320 13:25:44.998678 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.002188 4895 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.002559 4895 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.010513 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.009650075 podStartE2EDuration="40.009650075s" podCreationTimestamp="2026-03-20 13:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:24.896194634 +0000 UTC m=+224.405913620" watchObservedRunningTime="2026-03-20 13:25:45.009650075 +0000 UTC m=+244.519369081" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.012087 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/community-operators-2qwtb","openshift-authentication/oauth-openshift-558db77b4-q8wls"] Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.012848 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.012885 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c2f80ed-c929-4866-9df4-3513dec8b0d2" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.012254 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-xxqnz","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:25:45 crc kubenswrapper[4895]: E0320 13:25:45.013455 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="extract-utilities" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013490 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" 
containerName="extract-utilities" Mar 20 13:25:45 crc kubenswrapper[4895]: E0320 13:25:45.013522 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" containerName="installer" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013540 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" containerName="installer" Mar 20 13:25:45 crc kubenswrapper[4895]: E0320 13:25:45.013577 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a35300-9f9f-44c7-a1ff-d818032e001a" containerName="oauth-openshift" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013595 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a35300-9f9f-44c7-a1ff-d818032e001a" containerName="oauth-openshift" Mar 20 13:25:45 crc kubenswrapper[4895]: E0320 13:25:45.013623 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="extract-content" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013639 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="extract-content" Mar 20 13:25:45 crc kubenswrapper[4895]: E0320 13:25:45.013656 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="registry-server" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013669 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="registry-server" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013847 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" containerName="registry-server" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013880 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a35300-9f9f-44c7-a1ff-d818032e001a" 
containerName="oauth-openshift" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.013895 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c1431f-0be0-47f1-b86f-7a41d5015305" containerName="installer" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.014534 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.017173 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.018873 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.019121 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.019270 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.019383 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.019424 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.019999 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.020086 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:25:45 crc 
kubenswrapper[4895]: I0320 13:25:45.020135 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.020252 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.021003 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.021116 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.023204 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.029233 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.032150 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.046950 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.071762 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.071744561 podStartE2EDuration="20.071744561s" podCreationTimestamp="2026-03-20 13:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:45.068031886 +0000 UTC m=+244.577750852" 
watchObservedRunningTime="2026-03-20 13:25:45.071744561 +0000 UTC m=+244.581463527" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.111987 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.112265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.112368 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.112519 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 
13:25:45.112780 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-session\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.112893 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72gt\" (UniqueName: \"kubernetes.io/projected/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-kube-api-access-f72gt\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113027 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-dir\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113102 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113254 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113325 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113427 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113519 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: 
\"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-policies\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.113630 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.142201 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.143618 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.143806 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.214642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-dir\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.214752 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-dir\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") 
" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.215017 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.215337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.215555 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.215789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.215967 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.216160 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.216344 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-policies\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.216584 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.216799 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") 
" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217036 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217554 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-session\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217775 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72gt\" (UniqueName: \"kubernetes.io/projected/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-kube-api-access-f72gt\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217927 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.216716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.217769 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-audit-policies\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.218843 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.224512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc 
kubenswrapper[4895]: I0320 13:25:45.225301 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.225843 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.226897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.227706 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.228549 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-session\") pod 
\"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.228652 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.228829 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a35300-9f9f-44c7-a1ff-d818032e001a" path="/var/lib/kubelet/pods/64a35300-9f9f-44c7-a1ff-d818032e001a/volumes" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.230018 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71969a9-97c9-46c4-9e1c-051f3c86ae91" path="/var/lib/kubelet/pods/a71969a9-97c9-46c4-9e1c-051f3c86ae91/volumes" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.231010 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.244837 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72gt\" (UniqueName: \"kubernetes.io/projected/29a72351-3d8c-48b3-ab9f-cc2c9f8be995-kube-api-access-f72gt\") pod \"oauth-openshift-8ccb4757-xxqnz\" (UID: \"29a72351-3d8c-48b3-ab9f-cc2c9f8be995\") " pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.320160 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.337102 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.346438 4895 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.347753 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.353471 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.356344 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.425255 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.645627 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.647758 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.655152 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.663814 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 
20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.740973 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.747354 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.764153 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.792357 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.814889 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.833061 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.853678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.869408 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.920783 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:25:45 crc kubenswrapper[4895]: I0320 13:25:45.939416 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.008479 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.219213 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.265063 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.299738 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.458247 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.494726 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.538313 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.680571 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.769615 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.810075 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.830072 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.832958 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.870093 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.879010 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.887152 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.922122 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:25:46 crc kubenswrapper[4895]: I0320 13:25:46.975938 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.022487 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.037464 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.041063 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.058617 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.102743 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.137096 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.171285 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.181318 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.189498 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.219063 4895 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.227018 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.240109 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.244133 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.354292 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.409786 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.437175 4895 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.440603 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.530379 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.551346 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.615295 4895 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.615552 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260" gracePeriod=5 Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.625312 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.696138 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.747992 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.783486 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:25:47 crc kubenswrapper[4895]: 
I0320 13:25:47.816432 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.835629 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:25:47 crc kubenswrapper[4895]: I0320 13:25:47.941472 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.032293 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.076446 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.165559 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.211682 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.215030 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-xxqnz"] Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.294993 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.301053 4895 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.348172 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.358950 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.371501 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.391632 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.418284 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.461989 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.501419 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.632937 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-xxqnz"] Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.650067 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.721411 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.756172 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.798002 4895 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.847934 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.860499 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:25:48 crc kubenswrapper[4895]: I0320 13:25:48.957933 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.029883 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.062226 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.116027 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.145796 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.171461 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.348214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" event={"ID":"29a72351-3d8c-48b3-ab9f-cc2c9f8be995","Type":"ContainerStarted","Data":"eec3229a45b95e030673e76279b29268e1bf99e8a4fd29b512bba84a264a5b1a"} Mar 20 13:25:49 crc 
kubenswrapper[4895]: I0320 13:25:49.348271 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" event={"ID":"29a72351-3d8c-48b3-ab9f-cc2c9f8be995","Type":"ContainerStarted","Data":"d6d51e1e01dfdfd3173c1a43bd8dac3036d31b3750f06102a9fa845254f19b0b"} Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.349652 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.386234 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" podStartSLOduration=55.386214402 podStartE2EDuration="55.386214402s" podCreationTimestamp="2026-03-20 13:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:49.382839305 +0000 UTC m=+248.892558341" watchObservedRunningTime="2026-03-20 13:25:49.386214402 +0000 UTC m=+248.895933378" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.388175 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.445511 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.505545 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.560349 4895 patch_prober.go:28] interesting pod/oauth-openshift-8ccb4757-xxqnz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": read tcp 10.217.0.2:51402->10.217.0.66:6443: read: connection reset by peer" 
start-of-body= Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.560436 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" podUID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": read tcp 10.217.0.2:51402->10.217.0.66:6443: read: connection reset by peer" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.720271 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.741842 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.762896 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.873886 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:25:49 crc kubenswrapper[4895]: I0320 13:25:49.874658 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.074545 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.230946 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.342797 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.356280 4895 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-8ccb4757-xxqnz_29a72351-3d8c-48b3-ab9f-cc2c9f8be995/oauth-openshift/0.log" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.356334 4895 generic.go:334] "Generic (PLEG): container finished" podID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" containerID="eec3229a45b95e030673e76279b29268e1bf99e8a4fd29b512bba84a264a5b1a" exitCode=255 Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.356365 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" event={"ID":"29a72351-3d8c-48b3-ab9f-cc2c9f8be995","Type":"ContainerDied","Data":"eec3229a45b95e030673e76279b29268e1bf99e8a4fd29b512bba84a264a5b1a"} Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.357042 4895 scope.go:117] "RemoveContainer" containerID="eec3229a45b95e030673e76279b29268e1bf99e8a4fd29b512bba84a264a5b1a" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.779027 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.798371 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:25:50 crc kubenswrapper[4895]: I0320 13:25:50.918166 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.279687 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.299516 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.361845 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-8ccb4757-xxqnz_29a72351-3d8c-48b3-ab9f-cc2c9f8be995/oauth-openshift/1.log" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.362336 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-8ccb4757-xxqnz_29a72351-3d8c-48b3-ab9f-cc2c9f8be995/oauth-openshift/0.log" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.362366 4895 generic.go:334] "Generic (PLEG): container finished" podID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" containerID="f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b" exitCode=255 Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.362415 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" event={"ID":"29a72351-3d8c-48b3-ab9f-cc2c9f8be995","Type":"ContainerDied","Data":"f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b"} Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.362449 4895 scope.go:117] "RemoveContainer" containerID="eec3229a45b95e030673e76279b29268e1bf99e8a4fd29b512bba84a264a5b1a" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.362840 4895 scope.go:117] "RemoveContainer" containerID="f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b" Mar 20 13:25:51 crc kubenswrapper[4895]: E0320 13:25:51.363002 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-8ccb4757-xxqnz_openshift-authentication(29a72351-3d8c-48b3-ab9f-cc2c9f8be995)\"" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" podUID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.692698 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:25:51 crc 
kubenswrapper[4895]: I0320 13:25:51.746029 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.782559 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.807591 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.871103 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:25:51 crc kubenswrapper[4895]: I0320 13:25:51.948303 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.153178 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.206565 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.263168 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.297592 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.297881 4895 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.376259 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-8ccb4757-xxqnz_29a72351-3d8c-48b3-ab9f-cc2c9f8be995/oauth-openshift/1.log" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.377886 4895 scope.go:117] "RemoveContainer" containerID="f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b" Mar 20 13:25:52 crc kubenswrapper[4895]: E0320 13:25:52.378226 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-8ccb4757-xxqnz_openshift-authentication(29a72351-3d8c-48b3-ab9f-cc2c9f8be995)\"" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" podUID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.477514 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:25:52 crc kubenswrapper[4895]: I0320 13:25:52.581365 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.082349 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.158168 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.229484 4895 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.229562 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.339620 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.339971 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340122 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340424 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.339778 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340211 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340522 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.340979 4895 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.341087 4895 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.341190 4895 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.341282 4895 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.350431 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.384080 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.384134 4895 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260" exitCode=137 Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.384176 4895 scope.go:117] "RemoveContainer" containerID="ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.384341 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.405116 4895 scope.go:117] "RemoveContainer" containerID="ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260" Mar 20 13:25:53 crc kubenswrapper[4895]: E0320 13:25:53.405733 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260\": container with ID starting with ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260 not found: ID does not exist" containerID="ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.405784 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260"} err="failed to get container status \"ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260\": rpc error: code = NotFound desc = could not find container 
\"ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260\": container with ID starting with ae9cfd9026b9f3f081bb94a5b1b8a1297e079cbdd22bebf9099147e4fe025260 not found: ID does not exist" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.419516 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.442439 4895 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:53 crc kubenswrapper[4895]: I0320 13:25:53.878915 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:25:54 crc kubenswrapper[4895]: I0320 13:25:54.001299 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:25:54 crc kubenswrapper[4895]: I0320 13:25:54.318135 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:25:54 crc kubenswrapper[4895]: I0320 13:25:54.562586 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:25:54 crc kubenswrapper[4895]: I0320 13:25:54.840383 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.221039 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.221327 4895 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.234368 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.234422 4895 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="365dbd0c-8922-42e3-8668-64b1ab79e248" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.241168 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.241217 4895 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="365dbd0c-8922-42e3-8668-64b1ab79e248" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.357141 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.357218 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:25:55 crc kubenswrapper[4895]: I0320 13:25:55.357990 4895 scope.go:117] "RemoveContainer" containerID="f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b" Mar 20 13:25:55 crc kubenswrapper[4895]: E0320 13:25:55.358552 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-8ccb4757-xxqnz_openshift-authentication(29a72351-3d8c-48b3-ab9f-cc2c9f8be995)\"" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" 
podUID="29a72351-3d8c-48b3-ab9f-cc2c9f8be995" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.190632 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566886-bd6nz"] Mar 20 13:26:00 crc kubenswrapper[4895]: E0320 13:26:00.191295 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.191315 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.191507 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.192078 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.196079 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.196301 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.197010 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.205500 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-bd6nz"] Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.232995 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6m9j\" (UniqueName: \"kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j\") pod 
\"auto-csr-approver-29566886-bd6nz\" (UID: \"118ecc82-ea27-4675-8206-aa7457215e80\") " pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.333880 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6m9j\" (UniqueName: \"kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j\") pod \"auto-csr-approver-29566886-bd6nz\" (UID: \"118ecc82-ea27-4675-8206-aa7457215e80\") " pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.354841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6m9j\" (UniqueName: \"kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j\") pod \"auto-csr-approver-29566886-bd6nz\" (UID: \"118ecc82-ea27-4675-8206-aa7457215e80\") " pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.519131 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:00 crc kubenswrapper[4895]: I0320 13:26:00.791649 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-bd6nz"] Mar 20 13:26:00 crc kubenswrapper[4895]: W0320 13:26:00.794311 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod118ecc82_ea27_4675_8206_aa7457215e80.slice/crio-ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536 WatchSource:0}: Error finding container ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536: Status 404 returned error can't find the container with id ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536 Mar 20 13:26:01 crc kubenswrapper[4895]: I0320 13:26:01.449553 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" event={"ID":"118ecc82-ea27-4675-8206-aa7457215e80","Type":"ContainerStarted","Data":"ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536"} Mar 20 13:26:02 crc kubenswrapper[4895]: I0320 13:26:02.456803 4895 generic.go:334] "Generic (PLEG): container finished" podID="118ecc82-ea27-4675-8206-aa7457215e80" containerID="916b91cf254b8986c424481ee31a145cc5d343063629868bef5c7ca71df1fdfc" exitCode=0 Mar 20 13:26:02 crc kubenswrapper[4895]: I0320 13:26:02.456862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" event={"ID":"118ecc82-ea27-4675-8206-aa7457215e80","Type":"ContainerDied","Data":"916b91cf254b8986c424481ee31a145cc5d343063629868bef5c7ca71df1fdfc"} Mar 20 13:26:03 crc kubenswrapper[4895]: I0320 13:26:03.791566 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:03 crc kubenswrapper[4895]: I0320 13:26:03.883494 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6m9j\" (UniqueName: \"kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j\") pod \"118ecc82-ea27-4675-8206-aa7457215e80\" (UID: \"118ecc82-ea27-4675-8206-aa7457215e80\") " Mar 20 13:26:03 crc kubenswrapper[4895]: I0320 13:26:03.890464 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j" (OuterVolumeSpecName: "kube-api-access-h6m9j") pod "118ecc82-ea27-4675-8206-aa7457215e80" (UID: "118ecc82-ea27-4675-8206-aa7457215e80"). InnerVolumeSpecName "kube-api-access-h6m9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:03 crc kubenswrapper[4895]: I0320 13:26:03.985161 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6m9j\" (UniqueName: \"kubernetes.io/projected/118ecc82-ea27-4675-8206-aa7457215e80-kube-api-access-h6m9j\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:04 crc kubenswrapper[4895]: I0320 13:26:04.473569 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" event={"ID":"118ecc82-ea27-4675-8206-aa7457215e80","Type":"ContainerDied","Data":"ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536"} Mar 20 13:26:04 crc kubenswrapper[4895]: I0320 13:26:04.473618 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-bd6nz" Mar 20 13:26:04 crc kubenswrapper[4895]: I0320 13:26:04.473649 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba57d1f97be962f09ccf5c8a021686daae408edefda1200d9634c5232fab5536" Mar 20 13:26:10 crc kubenswrapper[4895]: I0320 13:26:10.211821 4895 scope.go:117] "RemoveContainer" containerID="f1aa5253b651b63f263ab98da1c4f1e2bde71dacd047b2d2aa7eaf0ab7c8ae3b" Mar 20 13:26:11 crc kubenswrapper[4895]: I0320 13:26:11.524995 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-8ccb4757-xxqnz_29a72351-3d8c-48b3-ab9f-cc2c9f8be995/oauth-openshift/1.log" Mar 20 13:26:11 crc kubenswrapper[4895]: I0320 13:26:11.525613 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" event={"ID":"29a72351-3d8c-48b3-ab9f-cc2c9f8be995","Type":"ContainerStarted","Data":"64ab63231e1ee0b331c52eae1bdd1cb7fc8857b6e70f618a103741210e4655ae"} Mar 20 13:26:11 crc kubenswrapper[4895]: I0320 13:26:11.526726 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:26:11 crc kubenswrapper[4895]: I0320 13:26:11.532667 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8ccb4757-xxqnz" Mar 20 13:26:13 crc kubenswrapper[4895]: E0320 13:26:13.906702 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94398e3b_a910_4cd4_bb8a_2e599d39e8e4.slice/crio-conmon-c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:26:14 crc kubenswrapper[4895]: I0320 13:26:14.549663 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerID="c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f" exitCode=0 Mar 20 13:26:14 crc kubenswrapper[4895]: I0320 13:26:14.549729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerDied","Data":"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f"} Mar 20 13:26:14 crc kubenswrapper[4895]: I0320 13:26:14.550334 4895 scope.go:117] "RemoveContainer" containerID="c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f" Mar 20 13:26:15 crc kubenswrapper[4895]: I0320 13:26:15.559492 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerStarted","Data":"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b"} Mar 20 13:26:15 crc kubenswrapper[4895]: I0320 13:26:15.561294 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:26:15 crc kubenswrapper[4895]: I0320 13:26:15.564804 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.297245 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.297681 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.297755 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.298552 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.298634 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa" gracePeriod=600 Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.966967 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa" exitCode=0 Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.967077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa"} Mar 20 13:26:22 crc kubenswrapper[4895]: I0320 13:26:22.967371 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" 
event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2"} Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.775598 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mb4sm"] Mar 20 13:27:05 crc kubenswrapper[4895]: E0320 13:27:05.776298 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118ecc82-ea27-4675-8206-aa7457215e80" containerName="oc" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.776311 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="118ecc82-ea27-4675-8206-aa7457215e80" containerName="oc" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.776426 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="118ecc82-ea27-4675-8206-aa7457215e80" containerName="oc" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.776761 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.789701 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mb4sm"] Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910367 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-registry-tls\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afa68578-8238-4d03-9d4e-910814d7689b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910590 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-trusted-ca\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910654 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910711 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-registry-certificates\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910740 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-bound-sa-token\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910798 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbhx5\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-kube-api-access-bbhx5\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.910831 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afa68578-8238-4d03-9d4e-910814d7689b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:05 crc kubenswrapper[4895]: I0320 13:27:05.932870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.012821 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-trusted-ca\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.013510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-registry-certificates\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.013573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-bound-sa-token\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.013622 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbhx5\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-kube-api-access-bbhx5\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc 
kubenswrapper[4895]: I0320 13:27:06.013662 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afa68578-8238-4d03-9d4e-910814d7689b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.013736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-registry-tls\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.013772 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afa68578-8238-4d03-9d4e-910814d7689b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.014523 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-trusted-ca\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.014911 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/afa68578-8238-4d03-9d4e-910814d7689b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.015640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/afa68578-8238-4d03-9d4e-910814d7689b-registry-certificates\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.022915 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/afa68578-8238-4d03-9d4e-910814d7689b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.025666 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-registry-tls\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.035364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbhx5\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-kube-api-access-bbhx5\") pod \"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.036871 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afa68578-8238-4d03-9d4e-910814d7689b-bound-sa-token\") pod 
\"image-registry-66df7c8f76-mb4sm\" (UID: \"afa68578-8238-4d03-9d4e-910814d7689b\") " pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.096505 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:06 crc kubenswrapper[4895]: I0320 13:27:06.496077 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mb4sm"] Mar 20 13:27:07 crc kubenswrapper[4895]: I0320 13:27:07.242682 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" event={"ID":"afa68578-8238-4d03-9d4e-910814d7689b","Type":"ContainerStarted","Data":"3a5940e8702274f4ca92cc7a62a667dec6ccb851c50f0950c6acb00ca89e92e6"} Mar 20 13:27:07 crc kubenswrapper[4895]: I0320 13:27:07.242751 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" event={"ID":"afa68578-8238-4d03-9d4e-910814d7689b","Type":"ContainerStarted","Data":"5eda46c49966d960ca05e8dc64602ac1d411dcf7019ff6cbe6cc372e91326ec0"} Mar 20 13:27:07 crc kubenswrapper[4895]: I0320 13:27:07.243668 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:07 crc kubenswrapper[4895]: I0320 13:27:07.267496 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" podStartSLOduration=2.267481143 podStartE2EDuration="2.267481143s" podCreationTimestamp="2026-03-20 13:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:07.266561052 +0000 UTC m=+326.776280018" watchObservedRunningTime="2026-03-20 13:27:07.267481143 +0000 UTC m=+326.777200099" Mar 20 13:27:23 crc 
kubenswrapper[4895]: I0320 13:27:23.300549 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.303684 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tdcr2" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="registry-server" containerID="cri-o://cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.311653 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpm8f"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.312041 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpm8f" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="registry-server" containerID="cri-o://9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.326710 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.327036 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" containerID="cri-o://80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.339714 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.340096 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gn8m7" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="registry-server" containerID="cri-o://0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.346090 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q72qk"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.346986 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.362366 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.362679 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6z86w" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="registry-server" containerID="cri-o://6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.366232 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q72qk"]
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.466019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.466414 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khr46\" (UniqueName: \"kubernetes.io/projected/8d7099f0-1367-48df-962f-2a7d34147dc9-kube-api-access-khr46\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.466464 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.567761 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khr46\" (UniqueName: \"kubernetes.io/projected/8d7099f0-1367-48df-962f-2a7d34147dc9-kube-api-access-khr46\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.567837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.567882 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.569091 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.583945 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8d7099f0-1367-48df-962f-2a7d34147dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.584478 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khr46\" (UniqueName: \"kubernetes.io/projected/8d7099f0-1367-48df-962f-2a7d34147dc9-kube-api-access-khr46\") pod \"marketplace-operator-79b997595-q72qk\" (UID: \"8d7099f0-1367-48df-962f-2a7d34147dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.672922 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.766293 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdcr2"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.787521 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpm8f"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.834663 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.838419 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.847588 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z86w"
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.875091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cw9j\" (UniqueName: \"kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j\") pod \"c9cbc624-2052-45bd-9d34-9cb03e70343c\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.876048 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content\") pod \"c9cbc624-2052-45bd-9d34-9cb03e70343c\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.876126 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities\") pod \"c9cbc624-2052-45bd-9d34-9cb03e70343c\" (UID: \"c9cbc624-2052-45bd-9d34-9cb03e70343c\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.877592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities" (OuterVolumeSpecName: "utilities") pod "c9cbc624-2052-45bd-9d34-9cb03e70343c" (UID: "c9cbc624-2052-45bd-9d34-9cb03e70343c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.879300 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j" (OuterVolumeSpecName: "kube-api-access-6cw9j") pod "c9cbc624-2052-45bd-9d34-9cb03e70343c" (UID: "c9cbc624-2052-45bd-9d34-9cb03e70343c"). InnerVolumeSpecName "kube-api-access-6cw9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.924237 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9cbc624-2052-45bd-9d34-9cb03e70343c" (UID: "c9cbc624-2052-45bd-9d34-9cb03e70343c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977267 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca\") pod \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities\") pod \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977435 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzpdt\" (UniqueName: \"kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt\") pod \"485a7267-c39b-4b1e-95b1-075e868421ed\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977464 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghwhb\" (UniqueName: \"kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb\") pod \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977507 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics\") pod \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977543 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities\") pod \"485a7267-c39b-4b1e-95b1-075e868421ed\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content\") pod \"485a7267-c39b-4b1e-95b1-075e868421ed\" (UID: \"485a7267-c39b-4b1e-95b1-075e868421ed\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977609 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content\") pod \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977704 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbbwx\" (UniqueName: \"kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx\") pod \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\" (UID: \"94398e3b-a910-4cd4-bb8a-2e599d39e8e4\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977744 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities\") pod \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977766 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content\") pod \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\" (UID: \"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977800 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvrcl\" (UniqueName: \"kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl\") pod \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\" (UID: \"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262\") "
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.977974 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "94398e3b-a910-4cd4-bb8a-2e599d39e8e4" (UID: "94398e3b-a910-4cd4-bb8a-2e599d39e8e4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.978701 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cw9j\" (UniqueName: \"kubernetes.io/projected/c9cbc624-2052-45bd-9d34-9cb03e70343c-kube-api-access-6cw9j\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.978725 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.978733 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.978743 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cbc624-2052-45bd-9d34-9cb03e70343c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.978932 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities" (OuterVolumeSpecName: "utilities") pod "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" (UID: "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.979276 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities" (OuterVolumeSpecName: "utilities") pod "485a7267-c39b-4b1e-95b1-075e868421ed" (UID: "485a7267-c39b-4b1e-95b1-075e868421ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.979858 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities" (OuterVolumeSpecName: "utilities") pod "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" (UID: "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.981123 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx" (OuterVolumeSpecName: "kube-api-access-jbbwx") pod "94398e3b-a910-4cd4-bb8a-2e599d39e8e4" (UID: "94398e3b-a910-4cd4-bb8a-2e599d39e8e4"). InnerVolumeSpecName "kube-api-access-jbbwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.981359 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl" (OuterVolumeSpecName: "kube-api-access-lvrcl") pod "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" (UID: "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262"). InnerVolumeSpecName "kube-api-access-lvrcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.981381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt" (OuterVolumeSpecName: "kube-api-access-hzpdt") pod "485a7267-c39b-4b1e-95b1-075e868421ed" (UID: "485a7267-c39b-4b1e-95b1-075e868421ed"). InnerVolumeSpecName "kube-api-access-hzpdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.982946 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb" (OuterVolumeSpecName: "kube-api-access-ghwhb") pod "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" (UID: "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b"). InnerVolumeSpecName "kube-api-access-ghwhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4895]: I0320 13:27:23.983244 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "94398e3b-a910-4cd4-bb8a-2e599d39e8e4" (UID: "94398e3b-a910-4cd4-bb8a-2e599d39e8e4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.015552 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" (UID: "f7d9f9c9-84fa-40b3-95fe-dd2f821c1262"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.030787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" (UID: "4e70e99c-ccbe-4290-ad2e-20f42e5bde4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079755 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079785 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbbwx\" (UniqueName: \"kubernetes.io/projected/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-kube-api-access-jbbwx\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079796 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079805 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079813 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvrcl\" (UniqueName: \"kubernetes.io/projected/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262-kube-api-access-lvrcl\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079821 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079828 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzpdt\" (UniqueName: \"kubernetes.io/projected/485a7267-c39b-4b1e-95b1-075e868421ed-kube-api-access-hzpdt\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079836 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghwhb\" (UniqueName: \"kubernetes.io/projected/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b-kube-api-access-ghwhb\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079846 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94398e3b-a910-4cd4-bb8a-2e599d39e8e4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.079854 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.105506 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q72qk"]
Mar 20 13:27:24 crc kubenswrapper[4895]: W0320 13:27:24.109430 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d7099f0_1367_48df_962f_2a7d34147dc9.slice/crio-9212c19cfd0930ad103a6bcf6a39e2eacf36eebe6927c5aeded58ada50c56efc WatchSource:0}: Error finding container 9212c19cfd0930ad103a6bcf6a39e2eacf36eebe6927c5aeded58ada50c56efc: Status 404 returned error can't find the container with id 9212c19cfd0930ad103a6bcf6a39e2eacf36eebe6927c5aeded58ada50c56efc
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.110273 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "485a7267-c39b-4b1e-95b1-075e868421ed" (UID: "485a7267-c39b-4b1e-95b1-075e868421ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.120576 4895 generic.go:334] "Generic (PLEG): container finished" podID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerID="0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.120635 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn8m7"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.120681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerDied","Data":"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.120748 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn8m7" event={"ID":"f7d9f9c9-84fa-40b3-95fe-dd2f821c1262","Type":"ContainerDied","Data":"8b81a0a1feb8d5d2e1b34f5bf7cd7152aba9b373e841540c9de7bbb6b72efd29"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.120767 4895 scope.go:117] "RemoveContainer" containerID="0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.123357 4895 generic.go:334] "Generic (PLEG): container finished" podID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerID="9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.123414 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerDied","Data":"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.123430 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpm8f" event={"ID":"4e70e99c-ccbe-4290-ad2e-20f42e5bde4b","Type":"ContainerDied","Data":"fa42e903ce40667fbdc0d5481bb3c1b35d83111daccd5873f825cee9c59a5b9d"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.123496 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpm8f"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.128905 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk" event={"ID":"8d7099f0-1367-48df-962f-2a7d34147dc9","Type":"ContainerStarted","Data":"9212c19cfd0930ad103a6bcf6a39e2eacf36eebe6927c5aeded58ada50c56efc"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.138191 4895 generic.go:334] "Generic (PLEG): container finished" podID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerID="80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.138232 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.138286 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerDied","Data":"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.138322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlxvz" event={"ID":"94398e3b-a910-4cd4-bb8a-2e599d39e8e4","Type":"ContainerDied","Data":"a52c69da1cc75b903adb00b8413f03f317674af75f9abf988a9d263e4b8f8c7a"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.161375 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.162681 4895 scope.go:117] "RemoveContainer" containerID="aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.164706 4895 generic.go:334] "Generic (PLEG): container finished" podID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerID="cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.164872 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdcr2"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.170517 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerDied","Data":"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.170564 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn8m7"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.170612 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdcr2" event={"ID":"c9cbc624-2052-45bd-9d34-9cb03e70343c","Type":"ContainerDied","Data":"b370d16710485612953694882ed525e618ca6e1a017b56efbed37add63e7706f"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.176597 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpm8f"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.178157 4895 generic.go:334] "Generic (PLEG): container finished" podID="485a7267-c39b-4b1e-95b1-075e868421ed" containerID="6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.178212 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerDied","Data":"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.178235 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z86w" event={"ID":"485a7267-c39b-4b1e-95b1-075e868421ed","Type":"ContainerDied","Data":"9431a6953ef5e0b1f45db6c0779276d8502cc96cdb2ea3a5113fc9126f89c67a"}
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.178367 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z86w"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.181489 4895 scope.go:117] "RemoveContainer" containerID="beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.182630 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485a7267-c39b-4b1e-95b1-075e868421ed-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.187954 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpm8f"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.194783 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.208497 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlxvz"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.216341 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.216457 4895 scope.go:117] "RemoveContainer" containerID="0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.216870 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141\": container with ID starting with 0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141 not found: ID does not exist" containerID="0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.216898 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141"} err="failed to get container status \"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141\": rpc error: code = NotFound desc = could not find container \"0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141\": container with ID starting with 0db6cde8d5f381a1de3886159e18e18ae13f4028b29f44c4533cf5392e9c6141 not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.216939 4895 scope.go:117] "RemoveContainer" containerID="aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.217183 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24\": container with ID starting with aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24 not found: ID does not exist" containerID="aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.217207 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24"} err="failed to get container status \"aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24\": rpc error: code = NotFound desc = could not find container \"aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24\": container with ID starting with aae1b798ae0dd15c307d856c8b2437bcfadd5eb1983fb606f1dd41f2bb390a24 not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.217221 4895 scope.go:117] "RemoveContainer" containerID="beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.217542 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31\": container with ID starting with beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31 not found: ID does not exist" containerID="beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.217562 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31"} err="failed to get container status \"beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31\": rpc error: code = NotFound desc = could not find container \"beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31\": container with ID starting with beeb7d17ca4c5259470708c1b6de599780f7202e564c08f82fb988775909de31 not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.217576 4895 scope.go:117] "RemoveContainer" containerID="9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.219456 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tdcr2"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.222703 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.225762 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6z86w"]
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.228614 4895 scope.go:117] "RemoveContainer" containerID="e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.247675 4895 scope.go:117] "RemoveContainer" containerID="e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.268593 4895 scope.go:117] "RemoveContainer" containerID="9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.269107 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9\": container with ID starting with 9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9 not found: ID does not exist" containerID="9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.269150 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9"} err="failed to get container status \"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9\": rpc error: code = NotFound desc = could not find container \"9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9\": container with ID starting with 9790ad0b5145dd82791d76e54eff7f9cc4c77a3093ce582c1dbf39c39ef4a8d9 not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.269175 4895 scope.go:117] "RemoveContainer" containerID="e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.269906 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b\": container with ID starting with e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b not found: ID does not exist" containerID="e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.269944 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b"} err="failed to get container status \"e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b\": rpc error: code = NotFound desc = could not find container \"e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b\": container with ID starting with e5afe95bfb15b0ea9995238435b7f77827dfd6124a52321d70eefcdd6f314b3b not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.269972 4895 scope.go:117] "RemoveContainer" containerID="e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f"
Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.270343 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f\": container with ID starting with e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f not found: ID does not exist" containerID="e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f"
Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.270368 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f"} err="failed to get container status \"e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f\": rpc error: code = NotFound desc = could not find container \"e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f\":
container with ID starting with e5097ebb1f835a44732bc7ee87a8b416027905d76a205e1ceee5cfb3bf1cd48f not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.270383 4895 scope.go:117] "RemoveContainer" containerID="80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.285676 4895 scope.go:117] "RemoveContainer" containerID="c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.300053 4895 scope.go:117] "RemoveContainer" containerID="80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.300323 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b\": container with ID starting with 80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b not found: ID does not exist" containerID="80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.300365 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b"} err="failed to get container status \"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b\": rpc error: code = NotFound desc = could not find container \"80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b\": container with ID starting with 80138e714bad6bbf90d2bef89ee35c22c023e8aa6230f758a0d3b280f045526b not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.300419 4895 scope.go:117] "RemoveContainer" containerID="c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.300665 4895 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f\": container with ID starting with c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f not found: ID does not exist" containerID="c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.300693 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f"} err="failed to get container status \"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f\": rpc error: code = NotFound desc = could not find container \"c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f\": container with ID starting with c6cd2a524ac6a0132230f7903559a6c94e4ee55a57be3cd72136bd5b1a24cc1f not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.300729 4895 scope.go:117] "RemoveContainer" containerID="cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.332305 4895 scope.go:117] "RemoveContainer" containerID="cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.346801 4895 scope.go:117] "RemoveContainer" containerID="4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.360879 4895 scope.go:117] "RemoveContainer" containerID="cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.361794 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e\": container with ID starting with 
cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e not found: ID does not exist" containerID="cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.361833 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e"} err="failed to get container status \"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e\": rpc error: code = NotFound desc = could not find container \"cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e\": container with ID starting with cb06c4a732c39e542d93d2a7775abf132d0075732875a33cf5c2ff0596e1540e not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.361860 4895 scope.go:117] "RemoveContainer" containerID="cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.362290 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c\": container with ID starting with cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c not found: ID does not exist" containerID="cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.362357 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c"} err="failed to get container status \"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c\": rpc error: code = NotFound desc = could not find container \"cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c\": container with ID starting with cc10885eb76a1dafac812381928ccc92aaaa578638c34839a5fe11c691b3a25c not found: ID does not 
exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.362426 4895 scope.go:117] "RemoveContainer" containerID="4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.362805 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a\": container with ID starting with 4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a not found: ID does not exist" containerID="4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.362829 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a"} err="failed to get container status \"4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a\": rpc error: code = NotFound desc = could not find container \"4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a\": container with ID starting with 4edbb35a65fa692dbe6facce1afe6149b8a963c39d2749ba8350a6af7daf6a1a not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.362846 4895 scope.go:117] "RemoveContainer" containerID="6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.377818 4895 scope.go:117] "RemoveContainer" containerID="635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.392784 4895 scope.go:117] "RemoveContainer" containerID="99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.411140 4895 scope.go:117] "RemoveContainer" containerID="6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f" Mar 20 13:27:24 crc 
kubenswrapper[4895]: E0320 13:27:24.411468 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f\": container with ID starting with 6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f not found: ID does not exist" containerID="6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.411491 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f"} err="failed to get container status \"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f\": rpc error: code = NotFound desc = could not find container \"6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f\": container with ID starting with 6b799a67bb4c943a3d17c450814078474f44d3ef7a0757adb9ff3152e922b97f not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.411511 4895 scope.go:117] "RemoveContainer" containerID="635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.411778 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407\": container with ID starting with 635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407 not found: ID does not exist" containerID="635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.411793 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407"} err="failed to get container status 
\"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407\": rpc error: code = NotFound desc = could not find container \"635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407\": container with ID starting with 635e045a76b8bf952591101a5e43e1e339fc3a3d29ab1c8808d063f314291407 not found: ID does not exist" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.411805 4895 scope.go:117] "RemoveContainer" containerID="99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e" Mar 20 13:27:24 crc kubenswrapper[4895]: E0320 13:27:24.412067 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e\": container with ID starting with 99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e not found: ID does not exist" containerID="99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e" Mar 20 13:27:24 crc kubenswrapper[4895]: I0320 13:27:24.412102 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e"} err="failed to get container status \"99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e\": rpc error: code = NotFound desc = could not find container \"99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e\": container with ID starting with 99ad070999ed0c27b4d217c97f26a51bfafe1e67a9a6c055afae4c1e9efc9e5e not found: ID does not exist" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.185381 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk" event={"ID":"8d7099f0-1367-48df-962f-2a7d34147dc9","Type":"ContainerStarted","Data":"0441580ce8157635e53f7512db72edc8688a6d0f8ba669548b2211e68913ed45"} Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.185732 4895 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.188656 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.201741 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q72qk" podStartSLOduration=2.2017189679999998 podStartE2EDuration="2.201718968s" podCreationTimestamp="2026-03-20 13:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:25.201311508 +0000 UTC m=+344.711030474" watchObservedRunningTime="2026-03-20 13:27:25.201718968 +0000 UTC m=+344.711437934" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.225697 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" path="/var/lib/kubelet/pods/485a7267-c39b-4b1e-95b1-075e868421ed/volumes" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.226639 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" path="/var/lib/kubelet/pods/4e70e99c-ccbe-4290-ad2e-20f42e5bde4b/volumes" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.227232 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" path="/var/lib/kubelet/pods/94398e3b-a910-4cd4-bb8a-2e599d39e8e4/volumes" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.228210 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" path="/var/lib/kubelet/pods/c9cbc624-2052-45bd-9d34-9cb03e70343c/volumes" Mar 20 13:27:25 crc kubenswrapper[4895]: I0320 13:27:25.228844 4895 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" path="/var/lib/kubelet/pods/f7d9f9c9-84fa-40b3-95fe-dd2f821c1262/volumes" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043251 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jk92b"] Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043492 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043507 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043524 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043532 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043544 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043553 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043566 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043574 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043584 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043592 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043602 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043610 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043620 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043628 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043638 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043646 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043657 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043666 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="extract-content" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043678 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043687 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043700 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043712 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043722 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043730 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="extract-utilities" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.043739 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043748 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043864 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d9f9c9-84fa-40b3-95fe-dd2f821c1262" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043879 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e70e99c-ccbe-4290-ad2e-20f42e5bde4b" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043892 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="485a7267-c39b-4b1e-95b1-075e868421ed" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043906 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043917 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.043931 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cbc624-2052-45bd-9d34-9cb03e70343c" containerName="registry-server" Mar 20 13:27:26 crc kubenswrapper[4895]: E0320 13:27:26.044036 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.044045 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="94398e3b-a910-4cd4-bb8a-2e599d39e8e4" containerName="marketplace-operator" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.044772 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.047603 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.062183 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jk92b"] Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.101947 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mb4sm" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.174414 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.208816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww5p\" (UniqueName: \"kubernetes.io/projected/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-kube-api-access-kww5p\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.208904 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-utilities\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.208935 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-catalog-content\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " 
pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.310102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kww5p\" (UniqueName: \"kubernetes.io/projected/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-kube-api-access-kww5p\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.310154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-utilities\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.310172 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-catalog-content\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.310591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-catalog-content\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.311605 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-utilities\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b" Mar 20 13:27:26 crc 
kubenswrapper[4895]: I0320 13:27:26.332672 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww5p\" (UniqueName: \"kubernetes.io/projected/602120eb-39e8-4b29-a2f9-7ff1fe5a0222-kube-api-access-kww5p\") pod \"redhat-operators-jk92b\" (UID: \"602120eb-39e8-4b29-a2f9-7ff1fe5a0222\") " pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.412148 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.676989 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.678483 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.680465 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jk92b"]
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.684108 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.712438 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.815205 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.815306 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.815418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566bc\" (UniqueName: \"kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.917319 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.917437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566bc\" (UniqueName: \"kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.917538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.917817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.918184 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:26 crc kubenswrapper[4895]: I0320 13:27:26.936082 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566bc\" (UniqueName: \"kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc\") pod \"community-operators-mng7h\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") " pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:27 crc kubenswrapper[4895]: I0320 13:27:27.044241 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:27 crc kubenswrapper[4895]: I0320 13:27:27.205014 4895 generic.go:334] "Generic (PLEG): container finished" podID="602120eb-39e8-4b29-a2f9-7ff1fe5a0222" containerID="184be01c28b2e8742c58063928c4b621cbe3be0fbdfc48cb79eaeb930f44a051" exitCode=0
Mar 20 13:27:27 crc kubenswrapper[4895]: I0320 13:27:27.206332 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk92b" event={"ID":"602120eb-39e8-4b29-a2f9-7ff1fe5a0222","Type":"ContainerDied","Data":"184be01c28b2e8742c58063928c4b621cbe3be0fbdfc48cb79eaeb930f44a051"}
Mar 20 13:27:27 crc kubenswrapper[4895]: I0320 13:27:27.206357 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk92b" event={"ID":"602120eb-39e8-4b29-a2f9-7ff1fe5a0222","Type":"ContainerStarted","Data":"e0a23ede6b8fa3a22c3d470bc61899cee6251bb86fd3f93c28dc2ed5c87ce35b"}
Mar 20 13:27:27 crc kubenswrapper[4895]: I0320 13:27:27.317989 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.218109 4895 generic.go:334] "Generic (PLEG): container finished" podID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerID="4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b" exitCode=0
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.218160 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerDied","Data":"4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b"}
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.218666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerStarted","Data":"440a213fcabf35bcb4c85b95d4ff15f82bb7a13cfb04f4f413d0c31064699eef"}
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.644563 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"]
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.647204 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.649711 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.663302 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"]
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.748352 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.748450 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9zsb\" (UniqueName: \"kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.748553 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.849431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.849519 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.849550 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9zsb\" (UniqueName: \"kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.850257 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.850368 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.868927 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9zsb\" (UniqueName: \"kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb\") pod \"certified-operators-rx9gl\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:28 crc kubenswrapper[4895]: I0320 13:27:28.963710 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.197561 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"]
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.244070 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerStarted","Data":"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"}
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.259443 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xnqrs"]
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.260672 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.264569 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.268664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk92b" event={"ID":"602120eb-39e8-4b29-a2f9-7ff1fe5a0222","Type":"ContainerStarted","Data":"a420de8b55d3386f1f578b61ac91d9151ce5830b4dbe0f3fb9b7a2dd143f3045"}
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.270832 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerStarted","Data":"45bd137c537a496b88dc46efad4ef71a8a4173460c4b30f43ca757790f24a40b"}
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.271869 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnqrs"]
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.357484 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-utilities\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.357522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhrh\" (UniqueName: \"kubernetes.io/projected/5e11a404-9171-42bf-83c6-341c3db05c44-kube-api-access-blhrh\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.357571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-catalog-content\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.458567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-catalog-content\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.458684 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-utilities\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.458706 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhrh\" (UniqueName: \"kubernetes.io/projected/5e11a404-9171-42bf-83c6-341c3db05c44-kube-api-access-blhrh\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.459126 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-utilities\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.459149 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11a404-9171-42bf-83c6-341c3db05c44-catalog-content\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.477208 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhrh\" (UniqueName: \"kubernetes.io/projected/5e11a404-9171-42bf-83c6-341c3db05c44-kube-api-access-blhrh\") pod \"redhat-marketplace-xnqrs\" (UID: \"5e11a404-9171-42bf-83c6-341c3db05c44\") " pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.674825 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:29 crc kubenswrapper[4895]: I0320 13:27:29.903831 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xnqrs"]
Mar 20 13:27:29 crc kubenswrapper[4895]: W0320 13:27:29.912472 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e11a404_9171_42bf_83c6_341c3db05c44.slice/crio-dd0fb55196345fe10b6d69bd33843f990beb8d13013fda01b4462330b860e70f WatchSource:0}: Error finding container dd0fb55196345fe10b6d69bd33843f990beb8d13013fda01b4462330b860e70f: Status 404 returned error can't find the container with id dd0fb55196345fe10b6d69bd33843f990beb8d13013fda01b4462330b860e70f
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.279502 4895 generic.go:334] "Generic (PLEG): container finished" podID="602120eb-39e8-4b29-a2f9-7ff1fe5a0222" containerID="a420de8b55d3386f1f578b61ac91d9151ce5830b4dbe0f3fb9b7a2dd143f3045" exitCode=0
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.279546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk92b" event={"ID":"602120eb-39e8-4b29-a2f9-7ff1fe5a0222","Type":"ContainerDied","Data":"a420de8b55d3386f1f578b61ac91d9151ce5830b4dbe0f3fb9b7a2dd143f3045"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.279907 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk92b" event={"ID":"602120eb-39e8-4b29-a2f9-7ff1fe5a0222","Type":"ContainerStarted","Data":"12f2580da5a294efd918bea3588dbcafbedaa529e09adef1ea3e229bb4f63b05"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.281860 4895 generic.go:334] "Generic (PLEG): container finished" podID="5e11a404-9171-42bf-83c6-341c3db05c44" containerID="737a08643a93baa22d0ca275a9a9b2834bfef14cec459552a4fc8c48a94ba0f3" exitCode=0
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.281919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnqrs" event={"ID":"5e11a404-9171-42bf-83c6-341c3db05c44","Type":"ContainerDied","Data":"737a08643a93baa22d0ca275a9a9b2834bfef14cec459552a4fc8c48a94ba0f3"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.281943 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnqrs" event={"ID":"5e11a404-9171-42bf-83c6-341c3db05c44","Type":"ContainerStarted","Data":"dd0fb55196345fe10b6d69bd33843f990beb8d13013fda01b4462330b860e70f"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.284987 4895 generic.go:334] "Generic (PLEG): container finished" podID="539ac798-b3af-410c-9e5e-4e45263b692b" containerID="cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6" exitCode=0
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.285243 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerDied","Data":"cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.294144 4895 generic.go:334] "Generic (PLEG): container finished" podID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerID="875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13" exitCode=0
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.294184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerDied","Data":"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"}
Mar 20 13:27:30 crc kubenswrapper[4895]: I0320 13:27:30.307590 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jk92b" podStartSLOduration=1.777551537 podStartE2EDuration="4.307572672s" podCreationTimestamp="2026-03-20 13:27:26 +0000 UTC" firstStartedPulling="2026-03-20 13:27:27.207545703 +0000 UTC m=+346.717264689" lastFinishedPulling="2026-03-20 13:27:29.737566818 +0000 UTC m=+349.247285824" observedRunningTime="2026-03-20 13:27:30.304153926 +0000 UTC m=+349.813872892" watchObservedRunningTime="2026-03-20 13:27:30.307572672 +0000 UTC m=+349.817291638"
Mar 20 13:27:31 crc kubenswrapper[4895]: I0320 13:27:31.303382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerStarted","Data":"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb"}
Mar 20 13:27:31 crc kubenswrapper[4895]: I0320 13:27:31.310290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerStarted","Data":"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"}
Mar 20 13:27:31 crc kubenswrapper[4895]: I0320 13:27:31.312475 4895 generic.go:334] "Generic (PLEG): container finished" podID="5e11a404-9171-42bf-83c6-341c3db05c44" containerID="2661b8f4d8d3ffa80216de7b5286549bf561279ed67b5088de2869e7d04aa8a6" exitCode=0
Mar 20 13:27:31 crc kubenswrapper[4895]: I0320 13:27:31.312785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnqrs" event={"ID":"5e11a404-9171-42bf-83c6-341c3db05c44","Type":"ContainerDied","Data":"2661b8f4d8d3ffa80216de7b5286549bf561279ed67b5088de2869e7d04aa8a6"}
Mar 20 13:27:31 crc kubenswrapper[4895]: I0320 13:27:31.347787 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mng7h" podStartSLOduration=2.852096227 podStartE2EDuration="5.347770587s" podCreationTimestamp="2026-03-20 13:27:26 +0000 UTC" firstStartedPulling="2026-03-20 13:27:28.221372402 +0000 UTC m=+347.731091368" lastFinishedPulling="2026-03-20 13:27:30.717046742 +0000 UTC m=+350.226765728" observedRunningTime="2026-03-20 13:27:31.345315602 +0000 UTC m=+350.855034588" watchObservedRunningTime="2026-03-20 13:27:31.347770587 +0000 UTC m=+350.857489553"
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.320484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xnqrs" event={"ID":"5e11a404-9171-42bf-83c6-341c3db05c44","Type":"ContainerStarted","Data":"f9863e0c9f54e480a6e9e677bc6e9b5199271695cb64f5f73fa008e20332b192"}
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.323282 4895 generic.go:334] "Generic (PLEG): container finished" podID="539ac798-b3af-410c-9e5e-4e45263b692b" containerID="509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb" exitCode=0
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.323354 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerDied","Data":"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb"}
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.323421 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerStarted","Data":"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8"}
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.338476 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xnqrs" podStartSLOduration=1.87060569 podStartE2EDuration="3.338454004s" podCreationTimestamp="2026-03-20 13:27:29 +0000 UTC" firstStartedPulling="2026-03-20 13:27:30.284216816 +0000 UTC m=+349.793935782" lastFinishedPulling="2026-03-20 13:27:31.75206513 +0000 UTC m=+351.261784096" observedRunningTime="2026-03-20 13:27:32.337134254 +0000 UTC m=+351.846853240" watchObservedRunningTime="2026-03-20 13:27:32.338454004 +0000 UTC m=+351.848172980"
Mar 20 13:27:32 crc kubenswrapper[4895]: I0320 13:27:32.360828 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rx9gl" podStartSLOduration=2.935238928 podStartE2EDuration="4.360810339s" podCreationTimestamp="2026-03-20 13:27:28 +0000 UTC" firstStartedPulling="2026-03-20 13:27:30.289508074 +0000 UTC m=+349.799227040" lastFinishedPulling="2026-03-20 13:27:31.715079485 +0000 UTC m=+351.224798451" observedRunningTime="2026-03-20 13:27:32.358110498 +0000 UTC m=+351.867829474" watchObservedRunningTime="2026-03-20 13:27:32.360810339 +0000 UTC m=+351.870529305"
Mar 20 13:27:36 crc kubenswrapper[4895]: I0320 13:27:36.412675 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:36 crc kubenswrapper[4895]: I0320 13:27:36.413302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:37 crc kubenswrapper[4895]: I0320 13:27:37.044502 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:37 crc kubenswrapper[4895]: I0320 13:27:37.044888 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:37 crc kubenswrapper[4895]: I0320 13:27:37.103154 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:37 crc kubenswrapper[4895]: I0320 13:27:37.415812 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:27:37 crc kubenswrapper[4895]: I0320 13:27:37.491877 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jk92b" podUID="602120eb-39e8-4b29-a2f9-7ff1fe5a0222" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:27:37 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s
Mar 20 13:27:37 crc kubenswrapper[4895]: >
Mar 20 13:27:38 crc kubenswrapper[4895]: I0320 13:27:38.964572 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:38 crc kubenswrapper[4895]: I0320 13:27:38.964636 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:39 crc kubenswrapper[4895]: I0320 13:27:39.010352 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:39 crc kubenswrapper[4895]: I0320 13:27:39.414622 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rx9gl"
Mar 20 13:27:39 crc kubenswrapper[4895]: I0320 13:27:39.675998 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:39 crc kubenswrapper[4895]: I0320 13:27:39.676051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:39 crc kubenswrapper[4895]: I0320 13:27:39.739090 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:40 crc kubenswrapper[4895]: I0320 13:27:40.409474 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xnqrs"
Mar 20 13:27:46 crc kubenswrapper[4895]: I0320 13:27:46.460630 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:46 crc kubenswrapper[4895]: I0320 13:27:46.498435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jk92b"
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.216130 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" podUID="e2338d00-a33d-4b4d-8686-064b95e39943" containerName="registry" containerID="cri-o://f9f94b55ba776816bda0ce898bdea565e4843d7596c715a485eaba0b3491be16" gracePeriod=30
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.447281 4895 generic.go:334] "Generic (PLEG): container finished" podID="e2338d00-a33d-4b4d-8686-064b95e39943" containerID="f9f94b55ba776816bda0ce898bdea565e4843d7596c715a485eaba0b3491be16" exitCode=0
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.447594 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" event={"ID":"e2338d00-a33d-4b4d-8686-064b95e39943","Type":"ContainerDied","Data":"f9f94b55ba776816bda0ce898bdea565e4843d7596c715a485eaba0b3491be16"}
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.580414 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj"
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667742 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667800 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667946 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvt26\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667971 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.667994 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.668177 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2338d00-a33d-4b4d-8686-064b95e39943\" (UID: \"e2338d00-a33d-4b4d-8686-064b95e39943\") "
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.668776 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.670224 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.673653 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.674148 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.675326 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26" (OuterVolumeSpecName: "kube-api-access-kvt26") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "kube-api-access-kvt26". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.684958 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.685413 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.685680 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2338d00-a33d-4b4d-8686-064b95e39943" (UID: "e2338d00-a33d-4b4d-8686-064b95e39943"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.768993 4895 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2338d00-a33d-4b4d-8686-064b95e39943-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769036 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769045 4895 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2338d00-a33d-4b4d-8686-064b95e39943-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769056 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvt26\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-kube-api-access-kvt26\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769065 4895 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2338d00-a33d-4b4d-8686-064b95e39943-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769074 4895 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:51 crc kubenswrapper[4895]: I0320 13:27:51.769082 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2338d00-a33d-4b4d-8686-064b95e39943-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:52 crc kubenswrapper[4895]: I0320 13:27:52.453744 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" event={"ID":"e2338d00-a33d-4b4d-8686-064b95e39943","Type":"ContainerDied","Data":"cf2b2936abf23c991c018d02c09b0fdc527707b4bb7aeb58701cda06ea9c463d"}
Mar 20 13:27:52 crc kubenswrapper[4895]: I0320 13:27:52.453941 4895 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2tqkj" Mar 20 13:27:52 crc kubenswrapper[4895]: I0320 13:27:52.453974 4895 scope.go:117] "RemoveContainer" containerID="f9f94b55ba776816bda0ce898bdea565e4843d7596c715a485eaba0b3491be16" Mar 20 13:27:52 crc kubenswrapper[4895]: I0320 13:27:52.485398 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:27:52 crc kubenswrapper[4895]: I0320 13:27:52.489116 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2tqkj"] Mar 20 13:27:53 crc kubenswrapper[4895]: I0320 13:27:53.227110 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2338d00-a33d-4b4d-8686-064b95e39943" path="/var/lib/kubelet/pods/e2338d00-a33d-4b4d-8686-064b95e39943/volumes" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.134469 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566888-vt8mx"] Mar 20 13:28:00 crc kubenswrapper[4895]: E0320 13:28:00.135150 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2338d00-a33d-4b4d-8686-064b95e39943" containerName="registry" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.135162 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2338d00-a33d-4b4d-8686-064b95e39943" containerName="registry" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.135253 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2338d00-a33d-4b4d-8686-064b95e39943" containerName="registry" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.135604 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.141172 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.141291 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.141410 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.150817 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-vt8mx"] Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.177676 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88x9\" (UniqueName: \"kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9\") pod \"auto-csr-approver-29566888-vt8mx\" (UID: \"c83f6aaf-7a04-4611-8654-826c470c1f94\") " pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.278656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88x9\" (UniqueName: \"kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9\") pod \"auto-csr-approver-29566888-vt8mx\" (UID: \"c83f6aaf-7a04-4611-8654-826c470c1f94\") " pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.303668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88x9\" (UniqueName: \"kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9\") pod \"auto-csr-approver-29566888-vt8mx\" (UID: \"c83f6aaf-7a04-4611-8654-826c470c1f94\") " 
pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.454308 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:00 crc kubenswrapper[4895]: I0320 13:28:00.865846 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-vt8mx"] Mar 20 13:28:01 crc kubenswrapper[4895]: I0320 13:28:01.519912 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" event={"ID":"c83f6aaf-7a04-4611-8654-826c470c1f94","Type":"ContainerStarted","Data":"7ced2d14e86decce9bebd8a6d46fb9fc5712db34673dcb4ee69f4ee162b3424e"} Mar 20 13:28:02 crc kubenswrapper[4895]: I0320 13:28:02.529329 4895 generic.go:334] "Generic (PLEG): container finished" podID="c83f6aaf-7a04-4611-8654-826c470c1f94" containerID="d95b253296ff25bab80900db298daca34b7737467616fe6c1f617c22adedcacf" exitCode=0 Mar 20 13:28:02 crc kubenswrapper[4895]: I0320 13:28:02.529452 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" event={"ID":"c83f6aaf-7a04-4611-8654-826c470c1f94","Type":"ContainerDied","Data":"d95b253296ff25bab80900db298daca34b7737467616fe6c1f617c22adedcacf"} Mar 20 13:28:03 crc kubenswrapper[4895]: I0320 13:28:03.733968 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:03 crc kubenswrapper[4895]: I0320 13:28:03.929594 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88x9\" (UniqueName: \"kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9\") pod \"c83f6aaf-7a04-4611-8654-826c470c1f94\" (UID: \"c83f6aaf-7a04-4611-8654-826c470c1f94\") " Mar 20 13:28:03 crc kubenswrapper[4895]: I0320 13:28:03.940592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9" (OuterVolumeSpecName: "kube-api-access-w88x9") pod "c83f6aaf-7a04-4611-8654-826c470c1f94" (UID: "c83f6aaf-7a04-4611-8654-826c470c1f94"). InnerVolumeSpecName "kube-api-access-w88x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:04 crc kubenswrapper[4895]: I0320 13:28:04.031154 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88x9\" (UniqueName: \"kubernetes.io/projected/c83f6aaf-7a04-4611-8654-826c470c1f94-kube-api-access-w88x9\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:04 crc kubenswrapper[4895]: I0320 13:28:04.541725 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" event={"ID":"c83f6aaf-7a04-4611-8654-826c470c1f94","Type":"ContainerDied","Data":"7ced2d14e86decce9bebd8a6d46fb9fc5712db34673dcb4ee69f4ee162b3424e"} Mar 20 13:28:04 crc kubenswrapper[4895]: I0320 13:28:04.541760 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ced2d14e86decce9bebd8a6d46fb9fc5712db34673dcb4ee69f4ee162b3424e" Mar 20 13:28:04 crc kubenswrapper[4895]: I0320 13:28:04.541795 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-vt8mx" Mar 20 13:28:22 crc kubenswrapper[4895]: I0320 13:28:22.296874 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:22 crc kubenswrapper[4895]: I0320 13:28:22.297567 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:28:52 crc kubenswrapper[4895]: I0320 13:28:52.296885 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:52 crc kubenswrapper[4895]: I0320 13:28:52.297694 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:29:22 crc kubenswrapper[4895]: I0320 13:29:22.297341 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:29:22 crc kubenswrapper[4895]: I0320 13:29:22.298094 4895 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:29:22 crc kubenswrapper[4895]: I0320 13:29:22.298164 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:29:22 crc kubenswrapper[4895]: I0320 13:29:22.299113 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:29:22 crc kubenswrapper[4895]: I0320 13:29:22.299211 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2" gracePeriod=600 Mar 20 13:29:23 crc kubenswrapper[4895]: I0320 13:29:23.045987 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2" exitCode=0 Mar 20 13:29:23 crc kubenswrapper[4895]: I0320 13:29:23.046123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2"} Mar 20 13:29:23 crc kubenswrapper[4895]: I0320 
13:29:23.046521 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee"} Mar 20 13:29:23 crc kubenswrapper[4895]: I0320 13:29:23.046555 4895 scope.go:117] "RemoveContainer" containerID="1f2d59fbd005b74d7a6c7427897622aec8a24aea24892ff69785a543394f4efa" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.147566 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566890-xm9vg"] Mar 20 13:30:00 crc kubenswrapper[4895]: E0320 13:30:00.148266 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83f6aaf-7a04-4611-8654-826c470c1f94" containerName="oc" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.148279 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83f6aaf-7a04-4611-8654-826c470c1f94" containerName="oc" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.148373 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83f6aaf-7a04-4611-8654-826c470c1f94" containerName="oc" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.148772 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.151942 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.152499 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.152531 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.153327 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl"] Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.154071 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.156533 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.156529 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.158715 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl"] Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.163072 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-xm9vg"] Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.276598 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvqq\" (UniqueName: 
\"kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq\") pod \"auto-csr-approver-29566890-xm9vg\" (UID: \"6ecd2503-6e65-462b-ada9-4fb5ede84f14\") " pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.276766 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.276825 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.277052 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpxv\" (UniqueName: \"kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.378024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpxv\" (UniqueName: \"kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" 
Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.378143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvqq\" (UniqueName: \"kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq\") pod \"auto-csr-approver-29566890-xm9vg\" (UID: \"6ecd2503-6e65-462b-ada9-4fb5ede84f14\") " pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.378250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.378314 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.379369 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.387415 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.396515 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpxv\" (UniqueName: \"kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv\") pod \"collect-profiles-29566890-nkfwl\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.405512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvqq\" (UniqueName: \"kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq\") pod \"auto-csr-approver-29566890-xm9vg\" (UID: \"6ecd2503-6e65-462b-ada9-4fb5ede84f14\") " pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.469345 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.478635 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.693280 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl"] Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.737346 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-xm9vg"] Mar 20 13:30:00 crc kubenswrapper[4895]: W0320 13:30:00.740007 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ecd2503_6e65_462b_ada9_4fb5ede84f14.slice/crio-b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a WatchSource:0}: Error finding container b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a: Status 404 returned error can't find the container with id b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a Mar 20 13:30:00 crc kubenswrapper[4895]: I0320 13:30:00.743736 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:30:01 crc kubenswrapper[4895]: I0320 13:30:01.293326 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" event={"ID":"6ecd2503-6e65-462b-ada9-4fb5ede84f14","Type":"ContainerStarted","Data":"b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a"} Mar 20 13:30:01 crc kubenswrapper[4895]: I0320 13:30:01.294897 4895 generic.go:334] "Generic (PLEG): container finished" podID="8fc54edb-e159-48f3-8c25-cc714a7ab3a5" containerID="149b13089bd0115b836c1af3981aea33aaa7ac022ee7bdde63733d22ca27509f" exitCode=0 Mar 20 13:30:01 crc kubenswrapper[4895]: I0320 13:30:01.294937 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" 
event={"ID":"8fc54edb-e159-48f3-8c25-cc714a7ab3a5","Type":"ContainerDied","Data":"149b13089bd0115b836c1af3981aea33aaa7ac022ee7bdde63733d22ca27509f"} Mar 20 13:30:01 crc kubenswrapper[4895]: I0320 13:30:01.294973 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" event={"ID":"8fc54edb-e159-48f3-8c25-cc714a7ab3a5","Type":"ContainerStarted","Data":"3ca0f2640aaced2a2122809a99ae32af283ad7625ba0d5694cc144897102a6ee"} Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.549993 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.706272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume\") pod \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.706376 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cpxv\" (UniqueName: \"kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv\") pod \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.706446 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume\") pod \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\" (UID: \"8fc54edb-e159-48f3-8c25-cc714a7ab3a5\") " Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.707192 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "8fc54edb-e159-48f3-8c25-cc714a7ab3a5" (UID: "8fc54edb-e159-48f3-8c25-cc714a7ab3a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.712212 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv" (OuterVolumeSpecName: "kube-api-access-8cpxv") pod "8fc54edb-e159-48f3-8c25-cc714a7ab3a5" (UID: "8fc54edb-e159-48f3-8c25-cc714a7ab3a5"). InnerVolumeSpecName "kube-api-access-8cpxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.720619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8fc54edb-e159-48f3-8c25-cc714a7ab3a5" (UID: "8fc54edb-e159-48f3-8c25-cc714a7ab3a5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.808815 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cpxv\" (UniqueName: \"kubernetes.io/projected/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-kube-api-access-8cpxv\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.808889 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:02 crc kubenswrapper[4895]: I0320 13:30:02.808916 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fc54edb-e159-48f3-8c25-cc714a7ab3a5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4895]: I0320 13:30:03.309024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" event={"ID":"8fc54edb-e159-48f3-8c25-cc714a7ab3a5","Type":"ContainerDied","Data":"3ca0f2640aaced2a2122809a99ae32af283ad7625ba0d5694cc144897102a6ee"} Mar 20 13:30:03 crc kubenswrapper[4895]: I0320 13:30:03.309077 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca0f2640aaced2a2122809a99ae32af283ad7625ba0d5694cc144897102a6ee" Mar 20 13:30:03 crc kubenswrapper[4895]: I0320 13:30:03.309043 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl" Mar 20 13:30:03 crc kubenswrapper[4895]: I0320 13:30:03.311408 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ecd2503-6e65-462b-ada9-4fb5ede84f14" containerID="e1def2f44dc457d1e9e6201a77041537be4ee6287e9b19c32ce6378f7cecf2ab" exitCode=0 Mar 20 13:30:03 crc kubenswrapper[4895]: I0320 13:30:03.311464 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" event={"ID":"6ecd2503-6e65-462b-ada9-4fb5ede84f14","Type":"ContainerDied","Data":"e1def2f44dc457d1e9e6201a77041537be4ee6287e9b19c32ce6378f7cecf2ab"} Mar 20 13:30:04 crc kubenswrapper[4895]: I0320 13:30:04.571195 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:04 crc kubenswrapper[4895]: I0320 13:30:04.732435 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfvqq\" (UniqueName: \"kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq\") pod \"6ecd2503-6e65-462b-ada9-4fb5ede84f14\" (UID: \"6ecd2503-6e65-462b-ada9-4fb5ede84f14\") " Mar 20 13:30:04 crc kubenswrapper[4895]: I0320 13:30:04.736650 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq" (OuterVolumeSpecName: "kube-api-access-hfvqq") pod "6ecd2503-6e65-462b-ada9-4fb5ede84f14" (UID: "6ecd2503-6e65-462b-ada9-4fb5ede84f14"). InnerVolumeSpecName "kube-api-access-hfvqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:04 crc kubenswrapper[4895]: I0320 13:30:04.834485 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfvqq\" (UniqueName: \"kubernetes.io/projected/6ecd2503-6e65-462b-ada9-4fb5ede84f14-kube-api-access-hfvqq\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:05 crc kubenswrapper[4895]: I0320 13:30:05.329563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" event={"ID":"6ecd2503-6e65-462b-ada9-4fb5ede84f14","Type":"ContainerDied","Data":"b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a"} Mar 20 13:30:05 crc kubenswrapper[4895]: I0320 13:30:05.329616 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-xm9vg" Mar 20 13:30:05 crc kubenswrapper[4895]: I0320 13:30:05.329627 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b84525eac992a2baf0da7080db5a9523e1bbd6fb0504c138033b9c99f2bf7a0a" Mar 20 13:30:05 crc kubenswrapper[4895]: I0320 13:30:05.636942 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-dxd7s"] Mar 20 13:30:05 crc kubenswrapper[4895]: I0320 13:30:05.640078 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-dxd7s"] Mar 20 13:30:07 crc kubenswrapper[4895]: I0320 13:30:07.219860 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="961b2d9b-3350-4f85-98af-412ea452ae83" path="/var/lib/kubelet/pods/961b2d9b-3350-4f85-98af-412ea452ae83/volumes" Mar 20 13:30:41 crc kubenswrapper[4895]: I0320 13:30:41.677882 4895 scope.go:117] "RemoveContainer" containerID="feba80ca1f4ff09ebaff062bc0c43d8104d95c72ba532a89192c2e95cef8f601" Mar 20 13:30:41 crc kubenswrapper[4895]: I0320 13:30:41.702083 4895 scope.go:117] "RemoveContainer" 
containerID="0d513000bd226a761adeeec48f2a23b06935898f09b024db232fc20f5127f7b1" Mar 20 13:31:22 crc kubenswrapper[4895]: I0320 13:31:22.296937 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:22 crc kubenswrapper[4895]: I0320 13:31:22.297999 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:31:41 crc kubenswrapper[4895]: I0320 13:31:41.773796 4895 scope.go:117] "RemoveContainer" containerID="4de377c8452ffdd9d09283143ae5a2a875a8355076050728efeff023517a3823" Mar 20 13:31:41 crc kubenswrapper[4895]: I0320 13:31:41.797673 4895 scope.go:117] "RemoveContainer" containerID="62c5b718a4a75f74c07414fec26fcd3038aae43c11a723774889806506c8cbbc" Mar 20 13:31:52 crc kubenswrapper[4895]: I0320 13:31:52.296911 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:52 crc kubenswrapper[4895]: I0320 13:31:52.297695 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.147462 4895 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-vd7lc"] Mar 20 13:32:00 crc kubenswrapper[4895]: E0320 13:32:00.148581 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecd2503-6e65-462b-ada9-4fb5ede84f14" containerName="oc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.148610 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecd2503-6e65-462b-ada9-4fb5ede84f14" containerName="oc" Mar 20 13:32:00 crc kubenswrapper[4895]: E0320 13:32:00.148657 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc54edb-e159-48f3-8c25-cc714a7ab3a5" containerName="collect-profiles" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.148673 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc54edb-e159-48f3-8c25-cc714a7ab3a5" containerName="collect-profiles" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.148893 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecd2503-6e65-462b-ada9-4fb5ede84f14" containerName="oc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.148933 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc54edb-e159-48f3-8c25-cc714a7ab3a5" containerName="collect-profiles" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.149686 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.153608 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.155155 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.156068 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.156742 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-vd7lc"] Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.348134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbbm\" (UniqueName: \"kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm\") pod \"auto-csr-approver-29566892-vd7lc\" (UID: \"9c0601c4-bbf5-49e4-bdc8-bd482c79f041\") " pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.449867 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bbbm\" (UniqueName: \"kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm\") pod \"auto-csr-approver-29566892-vd7lc\" (UID: \"9c0601c4-bbf5-49e4-bdc8-bd482c79f041\") " pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.479600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bbbm\" (UniqueName: \"kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm\") pod \"auto-csr-approver-29566892-vd7lc\" (UID: \"9c0601c4-bbf5-49e4-bdc8-bd482c79f041\") " 
pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.481788 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:00 crc kubenswrapper[4895]: I0320 13:32:00.671955 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-vd7lc"] Mar 20 13:32:01 crc kubenswrapper[4895]: I0320 13:32:01.082714 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" event={"ID":"9c0601c4-bbf5-49e4-bdc8-bd482c79f041","Type":"ContainerStarted","Data":"166eb4d74d23dceff91c79a55a1d16853e9d0680304f3dbf90e28d7765744e2d"} Mar 20 13:32:03 crc kubenswrapper[4895]: I0320 13:32:03.096691 4895 generic.go:334] "Generic (PLEG): container finished" podID="9c0601c4-bbf5-49e4-bdc8-bd482c79f041" containerID="240f0bddc8821c2a2acebda427b3d73cee92643ee86ba86a641ecfa039f24f8d" exitCode=0 Mar 20 13:32:03 crc kubenswrapper[4895]: I0320 13:32:03.097089 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" event={"ID":"9c0601c4-bbf5-49e4-bdc8-bd482c79f041","Type":"ContainerDied","Data":"240f0bddc8821c2a2acebda427b3d73cee92643ee86ba86a641ecfa039f24f8d"} Mar 20 13:32:04 crc kubenswrapper[4895]: I0320 13:32:04.325545 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:04 crc kubenswrapper[4895]: I0320 13:32:04.502384 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bbbm\" (UniqueName: \"kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm\") pod \"9c0601c4-bbf5-49e4-bdc8-bd482c79f041\" (UID: \"9c0601c4-bbf5-49e4-bdc8-bd482c79f041\") " Mar 20 13:32:04 crc kubenswrapper[4895]: I0320 13:32:04.508337 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm" (OuterVolumeSpecName: "kube-api-access-2bbbm") pod "9c0601c4-bbf5-49e4-bdc8-bd482c79f041" (UID: "9c0601c4-bbf5-49e4-bdc8-bd482c79f041"). InnerVolumeSpecName "kube-api-access-2bbbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:04 crc kubenswrapper[4895]: I0320 13:32:04.604655 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bbbm\" (UniqueName: \"kubernetes.io/projected/9c0601c4-bbf5-49e4-bdc8-bd482c79f041-kube-api-access-2bbbm\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4895]: I0320 13:32:05.112796 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" event={"ID":"9c0601c4-bbf5-49e4-bdc8-bd482c79f041","Type":"ContainerDied","Data":"166eb4d74d23dceff91c79a55a1d16853e9d0680304f3dbf90e28d7765744e2d"} Mar 20 13:32:05 crc kubenswrapper[4895]: I0320 13:32:05.112836 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="166eb4d74d23dceff91c79a55a1d16853e9d0680304f3dbf90e28d7765744e2d" Mar 20 13:32:05 crc kubenswrapper[4895]: I0320 13:32:05.112867 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-vd7lc" Mar 20 13:32:05 crc kubenswrapper[4895]: I0320 13:32:05.403430 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-bd6nz"] Mar 20 13:32:05 crc kubenswrapper[4895]: I0320 13:32:05.411007 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-bd6nz"] Mar 20 13:32:07 crc kubenswrapper[4895]: I0320 13:32:07.222835 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118ecc82-ea27-4675-8206-aa7457215e80" path="/var/lib/kubelet/pods/118ecc82-ea27-4675-8206-aa7457215e80/volumes" Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.299344 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.300615 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.300686 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.301355 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.301454 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee" gracePeriod=600 Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.765367 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee" exitCode=0 Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.765475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee"} Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.765911 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85"} Mar 20 13:32:22 crc kubenswrapper[4895]: I0320 13:32:22.765965 4895 scope.go:117] "RemoveContainer" containerID="d5a35b5f016a2264b4d04aa2948592a784f0738ac4f324378965533e2dae36d2" Mar 20 13:32:41 crc kubenswrapper[4895]: I0320 13:32:41.839259 4895 scope.go:117] "RemoveContainer" containerID="916b91cf254b8986c424481ee31a145cc5d343063629868bef5c7ca71df1fdfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.625966 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc"] Mar 20 13:32:51 crc 
kubenswrapper[4895]: E0320 13:32:51.626637 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0601c4-bbf5-49e4-bdc8-bd482c79f041" containerName="oc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.626652 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0601c4-bbf5-49e4-bdc8-bd482c79f041" containerName="oc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.626771 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0601c4-bbf5-49e4-bdc8-bd482c79f041" containerName="oc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.627631 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.629267 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.637874 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc"] Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.717894 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkj44\" (UniqueName: \"kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.717971 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" 
(UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.718043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.819490 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.819616 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkj44\" (UniqueName: \"kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.819678 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.820034 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.820713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.843703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkj44\" (UniqueName: \"kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:51 crc kubenswrapper[4895]: I0320 13:32:51.951586 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:52 crc kubenswrapper[4895]: I0320 13:32:52.147789 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc"] Mar 20 13:32:52 crc kubenswrapper[4895]: I0320 13:32:52.959272 4895 generic.go:334] "Generic (PLEG): container finished" podID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerID="6ed3a6c8f1a48c52583cf1404b4c9a56b49003f9fb75fdab694200883563637b" exitCode=0 Mar 20 13:32:52 crc kubenswrapper[4895]: I0320 13:32:52.959363 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" event={"ID":"16d1df97-cdad-4acd-8aa4-66383ab645cc","Type":"ContainerDied","Data":"6ed3a6c8f1a48c52583cf1404b4c9a56b49003f9fb75fdab694200883563637b"} Mar 20 13:32:52 crc kubenswrapper[4895]: I0320 13:32:52.959611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" event={"ID":"16d1df97-cdad-4acd-8aa4-66383ab645cc","Type":"ContainerStarted","Data":"68b511ad24facc17e595dc6f432d8eb762efb4d88641aec04e6b878062134551"} Mar 20 13:32:54 crc kubenswrapper[4895]: I0320 13:32:54.975430 4895 generic.go:334] "Generic (PLEG): container finished" podID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerID="06f48197b3588ccea390fa01a466bfe53950125e32436fdf1c779bbc510a1f79" exitCode=0 Mar 20 13:32:54 crc kubenswrapper[4895]: I0320 13:32:54.975546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" event={"ID":"16d1df97-cdad-4acd-8aa4-66383ab645cc","Type":"ContainerDied","Data":"06f48197b3588ccea390fa01a466bfe53950125e32436fdf1c779bbc510a1f79"} Mar 20 13:32:55 crc kubenswrapper[4895]: I0320 13:32:55.992761 4895 
generic.go:334] "Generic (PLEG): container finished" podID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerID="4c2152ccd3257929cdcb8f3248f4a55d032511e38ffae9d36cc0f4c1f560ef71" exitCode=0 Mar 20 13:32:55 crc kubenswrapper[4895]: I0320 13:32:55.992817 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" event={"ID":"16d1df97-cdad-4acd-8aa4-66383ab645cc","Type":"ContainerDied","Data":"4c2152ccd3257929cdcb8f3248f4a55d032511e38ffae9d36cc0f4c1f560ef71"} Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.269909 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.397488 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util\") pod \"16d1df97-cdad-4acd-8aa4-66383ab645cc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.397574 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkj44\" (UniqueName: \"kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44\") pod \"16d1df97-cdad-4acd-8aa4-66383ab645cc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.397678 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle\") pod \"16d1df97-cdad-4acd-8aa4-66383ab645cc\" (UID: \"16d1df97-cdad-4acd-8aa4-66383ab645cc\") " Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.401325 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle" (OuterVolumeSpecName: "bundle") pod "16d1df97-cdad-4acd-8aa4-66383ab645cc" (UID: "16d1df97-cdad-4acd-8aa4-66383ab645cc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.403246 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44" (OuterVolumeSpecName: "kube-api-access-jkj44") pod "16d1df97-cdad-4acd-8aa4-66383ab645cc" (UID: "16d1df97-cdad-4acd-8aa4-66383ab645cc"). InnerVolumeSpecName "kube-api-access-jkj44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.419246 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util" (OuterVolumeSpecName: "util") pod "16d1df97-cdad-4acd-8aa4-66383ab645cc" (UID: "16d1df97-cdad-4acd-8aa4-66383ab645cc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.500184 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.500241 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkj44\" (UniqueName: \"kubernetes.io/projected/16d1df97-cdad-4acd-8aa4-66383ab645cc-kube-api-access-jkj44\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:57 crc kubenswrapper[4895]: I0320 13:32:57.500270 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16d1df97-cdad-4acd-8aa4-66383ab645cc-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:58 crc kubenswrapper[4895]: I0320 13:32:58.007807 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" event={"ID":"16d1df97-cdad-4acd-8aa4-66383ab645cc","Type":"ContainerDied","Data":"68b511ad24facc17e595dc6f432d8eb762efb4d88641aec04e6b878062134551"} Mar 20 13:32:58 crc kubenswrapper[4895]: I0320 13:32:58.007866 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc" Mar 20 13:32:58 crc kubenswrapper[4895]: I0320 13:32:58.007877 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b511ad24facc17e595dc6f432d8eb762efb4d88641aec04e6b878062134551" Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.946413 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v6kxx"] Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948131 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-controller" containerID="cri-o://5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948258 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="nbdb" containerID="cri-o://e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948312 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948382 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-acl-logging" containerID="cri-o://3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" gracePeriod=30 Mar 20 13:33:02 crc 
kubenswrapper[4895]: I0320 13:33:02.948413 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-node" containerID="cri-o://3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948381 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="sbdb" containerID="cri-o://bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.948444 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="northd" containerID="cri-o://1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" gracePeriod=30 Mar 20 13:33:02 crc kubenswrapper[4895]: I0320 13:33:02.986564 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovnkube-controller" containerID="cri-o://377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" gracePeriod=30 Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.037682 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5xtn2_cab32ac6-a22f-4e11-9eaf-4c50ffbce748/kube-multus/0.log" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.037752 4895 generic.go:334] "Generic (PLEG): container finished" podID="cab32ac6-a22f-4e11-9eaf-4c50ffbce748" containerID="c09f78e64ccb74ba75fb916374c78e76f93d74bac69d22c44e8e7d3ecb5ade2e" exitCode=2 Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.037796 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5xtn2" event={"ID":"cab32ac6-a22f-4e11-9eaf-4c50ffbce748","Type":"ContainerDied","Data":"c09f78e64ccb74ba75fb916374c78e76f93d74bac69d22c44e8e7d3ecb5ade2e"} Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.038426 4895 scope.go:117] "RemoveContainer" containerID="c09f78e64ccb74ba75fb916374c78e76f93d74bac69d22c44e8e7d3ecb5ade2e" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.300553 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v6kxx_3b961aee-5ade-4c44-af26-349f5a34a3d2/ovn-acl-logging/0.log" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.301227 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v6kxx_3b961aee-5ade-4c44-af26-349f5a34a3d2/ovn-controller/0.log" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.301695 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.358839 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f6s5l"] Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359064 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="extract" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359082 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="extract" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359094 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="sbdb" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359101 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="sbdb" Mar 20 13:33:03 crc 
kubenswrapper[4895]: E0320 13:33:03.359107 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="util" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359114 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="util" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359124 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovnkube-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359131 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovnkube-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359140 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-acl-logging" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359146 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-acl-logging" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359154 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="pull" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359159 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="pull" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359168 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kubecfg-setup" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359174 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kubecfg-setup" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359182 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359188 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359194 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="nbdb" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359200 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="nbdb" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359210 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-node" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359216 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-node" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359225 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359230 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: E0320 13:33:03.359237 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="northd" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359242 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="northd" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359326 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-acl-logging" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359338 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d1df97-cdad-4acd-8aa4-66383ab645cc" containerName="extract" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359345 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovnkube-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359353 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="nbdb" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359361 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359369 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="sbdb" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359378 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="ovn-controller" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359404 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="northd" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.359415 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerName="kube-rbac-proxy-node" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.361224 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373530 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373585 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373608 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373607 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373631 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373666 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373701 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373723 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373741 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373747 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373760 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373764 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn6xf\" (UniqueName: \"kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373812 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373856 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373873 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373910 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373933 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373954 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.373979 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: 
\"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert\") pod \"3b961aee-5ade-4c44-af26-349f5a34a3d2\" (UID: \"3b961aee-5ade-4c44-af26-349f5a34a3d2\") " Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374195 4895 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374208 4895 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374217 4895 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374228 4895 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374294 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374326 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374422 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374477 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374498 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket" (OuterVolumeSpecName: "log-socket") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374517 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash" (OuterVolumeSpecName: "host-slash") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374535 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log" (OuterVolumeSpecName: "node-log") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374554 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374574 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374669 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374704 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.374847 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.375262 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.381462 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.382162 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf" (OuterVolumeSpecName: "kube-api-access-pn6xf") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "kube-api-access-pn6xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.398914 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3b961aee-5ade-4c44-af26-349f5a34a3d2" (UID: "3b961aee-5ade-4c44-af26-349f5a34a3d2"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-systemd\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475526 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-config\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475552 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-etc-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475575 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-log-socket\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475606 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-netd\") pod \"ovnkube-node-f6s5l\" (UID: 
\"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-env-overrides\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475649 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a661681-cdbd-4833-81d7-6b31b8f99763-ovn-node-metrics-cert\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475671 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-netns\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475732 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-slash\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475758 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-var-lib-openvswitch\") pod 
\"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475781 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-kubelet\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-script-lib\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.475975 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrlp\" (UniqueName: \"kubernetes.io/projected/1a661681-cdbd-4833-81d7-6b31b8f99763-kube-api-access-vfrlp\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-node-log\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476046 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-systemd-units\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-ovn\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476107 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476159 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-bin\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476203 4895 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476214 4895 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476223 4895 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476232 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476241 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn6xf\" (UniqueName: \"kubernetes.io/projected/3b961aee-5ade-4c44-af26-349f5a34a3d2-kube-api-access-pn6xf\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476250 4895 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc 
kubenswrapper[4895]: I0320 13:33:03.476259 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476268 4895 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476278 4895 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476291 4895 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476301 4895 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476312 4895 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476323 4895 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476335 4895 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3b961aee-5ade-4c44-af26-349f5a34a3d2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476377 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.476389 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b961aee-5ade-4c44-af26-349f5a34a3d2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-netns\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577436 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-slash\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577472 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-var-lib-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577511 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577530 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-netns\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577545 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-slash\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577548 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-kubelet\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577674 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577676 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-var-lib-openvswitch\") pod 
\"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577709 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-kubelet\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-script-lib\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrlp\" (UniqueName: \"kubernetes.io/projected/1a661681-cdbd-4833-81d7-6b31b8f99763-kube-api-access-vfrlp\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577814 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-node-log\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577854 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-systemd-units\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577873 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-node-log\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-ovn\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577903 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-systemd-units\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577918 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-ovn\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577950 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577973 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.577957 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-bin\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578053 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-systemd\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 
13:33:03.578074 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-bin\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578081 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-config\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578097 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-run-systemd\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-etc-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578139 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-log-socket\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578167 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-etc-openvswitch\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578176 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-netd\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578212 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-host-cni-netd\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578229 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-env-overrides\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578279 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a661681-cdbd-4833-81d7-6b31b8f99763-ovn-node-metrics-cert\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578662 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-config\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578229 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a661681-cdbd-4833-81d7-6b31b8f99763-log-socket\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.578701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-env-overrides\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.579445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a661681-cdbd-4833-81d7-6b31b8f99763-ovnkube-script-lib\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.582171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a661681-cdbd-4833-81d7-6b31b8f99763-ovn-node-metrics-cert\") pod \"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.599096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrlp\" (UniqueName: \"kubernetes.io/projected/1a661681-cdbd-4833-81d7-6b31b8f99763-kube-api-access-vfrlp\") pod 
\"ovnkube-node-f6s5l\" (UID: \"1a661681-cdbd-4833-81d7-6b31b8f99763\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: I0320 13:33:03.692040 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:03 crc kubenswrapper[4895]: W0320 13:33:03.708330 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a661681_cdbd_4833_81d7_6b31b8f99763.slice/crio-0f818682003f1449339d805e4b36bd0eac3b5e40d44597372a673d5b76ede58d WatchSource:0}: Error finding container 0f818682003f1449339d805e4b36bd0eac3b5e40d44597372a673d5b76ede58d: Status 404 returned error can't find the container with id 0f818682003f1449339d805e4b36bd0eac3b5e40d44597372a673d5b76ede58d Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.048335 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v6kxx_3b961aee-5ade-4c44-af26-349f5a34a3d2/ovn-acl-logging/0.log" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049324 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v6kxx_3b961aee-5ade-4c44-af26-349f5a34a3d2/ovn-controller/0.log" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049801 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049829 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049840 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" 
containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049849 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049860 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049869 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049877 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" exitCode=143 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049886 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b961aee-5ade-4c44-af26-349f5a34a3d2" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" exitCode=143 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049935 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" 
event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049983 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.049996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050009 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050021 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050035 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050046 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050054 4895 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050064 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050074 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050082 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050089 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050097 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050105 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050111 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} Mar 20 
13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050119 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050127 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050134 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050154 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050163 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050170 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050178 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050184 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050191 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050198 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050205 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050212 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050222 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" event={"ID":"3b961aee-5ade-4c44-af26-349f5a34a3d2","Type":"ContainerDied","Data":"939c389635ab19eb220e42525e882c3143bdd62ffee5f02c304fd0b9f3583d8e"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050232 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050240 4895 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050249 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050256 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050263 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050270 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050277 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050285 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050294 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050310 4895 
scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.050495 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v6kxx" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.056772 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a661681-cdbd-4833-81d7-6b31b8f99763" containerID="13f7f9b7c2c5f383829392820f841ec2ccac6c257f9ae91c13eab5bb7e5b7dbd" exitCode=0 Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.056856 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerDied","Data":"13f7f9b7c2c5f383829392820f841ec2ccac6c257f9ae91c13eab5bb7e5b7dbd"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.056906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"0f818682003f1449339d805e4b36bd0eac3b5e40d44597372a673d5b76ede58d"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.061098 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5xtn2_cab32ac6-a22f-4e11-9eaf-4c50ffbce748/kube-multus/0.log" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.061159 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5xtn2" event={"ID":"cab32ac6-a22f-4e11-9eaf-4c50ffbce748","Type":"ContainerStarted","Data":"2a2944c016cdfb40d5bfb916a7afca395cb190804dfe3f983b9b06e9dd7d2d3e"} Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.127582 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.161901 4895 scope.go:117] "RemoveContainer" 
containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.179154 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.200741 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.223548 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.235527 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v6kxx"] Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.256190 4895 scope.go:117] "RemoveContainer" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.258045 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v6kxx"] Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.271122 4895 scope.go:117] "RemoveContainer" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.292885 4895 scope.go:117] "RemoveContainer" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.309136 4895 scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.309741 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 
377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.309775 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} err="failed to get container status \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.309802 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.310169 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not exist" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.310210 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} err="failed to get container status \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": rpc error: code = NotFound desc = could not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not 
exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.310238 4895 scope.go:117] "RemoveContainer" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.310651 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not exist" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.310707 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} err="failed to get container status \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.310737 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.311017 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311047 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} err="failed to get container status \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311067 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.311410 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311441 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} err="failed to get container status \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311470 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.311745 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311774 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} err="failed to get container status \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.311792 4895 scope.go:117] "RemoveContainer" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.312073 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": container with ID starting with 3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745 not found: ID does not exist" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312101 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} err="failed to get container status \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": rpc error: code = NotFound desc = could 
not find container \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": container with ID starting with 3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312118 4895 scope.go:117] "RemoveContainer" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.312420 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": container with ID starting with 5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55 not found: ID does not exist" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312452 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} err="failed to get container status \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": rpc error: code = NotFound desc = could not find container \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": container with ID starting with 5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312471 4895 scope.go:117] "RemoveContainer" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: E0320 13:33:04.312730 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": container with ID starting with c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7 not found: 
ID does not exist" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312758 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} err="failed to get container status \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": rpc error: code = NotFound desc = could not find container \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": container with ID starting with c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.312777 4895 scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.313073 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} err="failed to get container status \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.313111 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.313899 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} err="failed to get container status \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": rpc error: code = 
NotFound desc = could not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.313922 4895 scope.go:117] "RemoveContainer" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.314200 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} err="failed to get container status \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.314221 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.314525 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} err="failed to get container status \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.314544 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc 
kubenswrapper[4895]: I0320 13:33:04.314802 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} err="failed to get container status \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.314823 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315133 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} err="failed to get container status \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315162 4895 scope.go:117] "RemoveContainer" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315437 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} err="failed to get container status \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": rpc error: code = NotFound desc = could not find container \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": container 
with ID starting with 3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315460 4895 scope.go:117] "RemoveContainer" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315721 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} err="failed to get container status \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": rpc error: code = NotFound desc = could not find container \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": container with ID starting with 5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315750 4895 scope.go:117] "RemoveContainer" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315979 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} err="failed to get container status \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": rpc error: code = NotFound desc = could not find container \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": container with ID starting with c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.315998 4895 scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316266 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} err="failed to get container status \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316308 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316548 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} err="failed to get container status \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": rpc error: code = NotFound desc = could not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316568 4895 scope.go:117] "RemoveContainer" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316815 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} err="failed to get container status \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not 
exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.316838 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.317063 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} err="failed to get container status \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.317097 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.317376 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} err="failed to get container status \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.317421 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.321560 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} err="failed to get container status 
\"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.321609 4895 scope.go:117] "RemoveContainer" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.322987 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} err="failed to get container status \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": rpc error: code = NotFound desc = could not find container \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": container with ID starting with 3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.323026 4895 scope.go:117] "RemoveContainer" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.323355 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} err="failed to get container status \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": rpc error: code = NotFound desc = could not find container \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": container with ID starting with 5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.323407 4895 scope.go:117] "RemoveContainer" 
containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.323769 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} err="failed to get container status \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": rpc error: code = NotFound desc = could not find container \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": container with ID starting with c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.323800 4895 scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324093 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} err="failed to get container status \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324117 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324428 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} err="failed to get container status \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": rpc error: code = NotFound desc = could 
not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324451 4895 scope.go:117] "RemoveContainer" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324818 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} err="failed to get container status \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.324841 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.325152 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} err="failed to get container status \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.325174 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 
13:33:04.325584 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} err="failed to get container status \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.325607 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.326210 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} err="failed to get container status \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.326230 4895 scope.go:117] "RemoveContainer" containerID="3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.326702 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745"} err="failed to get container status \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": rpc error: code = NotFound desc = could not find container \"3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745\": container with ID starting with 
3ee511a5e883dc374b6f0a7f5aa86b0d3b7997e45fbea467253ce96c43619745 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.326731 4895 scope.go:117] "RemoveContainer" containerID="5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.327072 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55"} err="failed to get container status \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": rpc error: code = NotFound desc = could not find container \"5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55\": container with ID starting with 5e5e0ead670ee114422c1964b9f00e63d2b765e8cd7aea3d2b35f363c83fbb55 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.327089 4895 scope.go:117] "RemoveContainer" containerID="c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.330419 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7"} err="failed to get container status \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": rpc error: code = NotFound desc = could not find container \"c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7\": container with ID starting with c66640fd7d3afde5b1f00538aab4715e82bf245ebe7b15dae9d9cfcbcd67a9f7 not found: ID does not exist" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.330453 4895 scope.go:117] "RemoveContainer" containerID="377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727" Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.330860 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727"} err="failed to get container status \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": rpc error: code = NotFound desc = could not find container \"377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727\": container with ID starting with 377fdfef9cc0dd8971f3773cc9bb56d02aa3302c1c19b0d06ab0de853f883727 not found: ID does not exist"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.330881 4895 scope.go:117] "RemoveContainer" containerID="bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331150 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471"} err="failed to get container status \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": rpc error: code = NotFound desc = could not find container \"bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471\": container with ID starting with bd42d26c994be7eb8d662065c8bbf8abd1d52b6426b1c11ad9f3460bd9ab1471 not found: ID does not exist"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331175 4895 scope.go:117] "RemoveContainer" containerID="e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331406 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d"} err="failed to get container status \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": rpc error: code = NotFound desc = could not find container \"e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d\": container with ID starting with e342baa8718ae00e99f6b7de0aa296c69da95154eaf712b4347af62ba1c7bf5d not found: ID does not exist"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331436 4895 scope.go:117] "RemoveContainer" containerID="1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331669 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683"} err="failed to get container status \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": rpc error: code = NotFound desc = could not find container \"1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683\": container with ID starting with 1445379985ce2e396017fdefc7828a99c18d3ec21d92e2df70aaf98d2ff2b683 not found: ID does not exist"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331701 4895 scope.go:117] "RemoveContainer" containerID="e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331963 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2"} err="failed to get container status \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": rpc error: code = NotFound desc = could not find container \"e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2\": container with ID starting with e150b7a87a631ecb3c8f98e29953edaded414da396bef2ac3a41cffe6e8703f2 not found: ID does not exist"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.331987 4895 scope.go:117] "RemoveContainer" containerID="3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"
Mar 20 13:33:04 crc kubenswrapper[4895]: I0320 13:33:04.332411 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85"} err="failed to get container status \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": rpc error: code = NotFound desc = could not find container \"3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85\": container with ID starting with 3dccadeee080b43a2bd92060075b187329d532bd3715fe36d50b9c3627e26d85 not found: ID does not exist"
Mar 20 13:33:05 crc kubenswrapper[4895]: I0320 13:33:05.073574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"b7b671ee76b050187c2bf0eed7ddaf0498cfe0e6d8ddd9284020c1801e8d530c"}
Mar 20 13:33:05 crc kubenswrapper[4895]: I0320 13:33:05.073931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"89a402483c1e4787257d6de272c07c285f2e35e1558fabafc1e30345d18ae093"}
Mar 20 13:33:05 crc kubenswrapper[4895]: I0320 13:33:05.073948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"908780e1b3c9e5b2dc3fe0c6824b9d1c7506ae6f00aad6ccf51b94900e197b6e"}
Mar 20 13:33:05 crc kubenswrapper[4895]: I0320 13:33:05.073962 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"23034243708f9bc6a0a7ef8bf417b63ae5708fbd71b3efd3331dc616c8a6fb38"}
Mar 20 13:33:05 crc kubenswrapper[4895]: I0320 13:33:05.218344 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b961aee-5ade-4c44-af26-349f5a34a3d2" path="/var/lib/kubelet/pods/3b961aee-5ade-4c44-af26-349f5a34a3d2/volumes"
Mar 20 13:33:06 crc kubenswrapper[4895]: I0320 13:33:06.084113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"c2962bc0bf28254497a24712e8a319b7e87fcc1bee8a17a1d22501cce0c77ba5"}
Mar 20 13:33:06 crc kubenswrapper[4895]: I0320 13:33:06.084178 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"c5e5d6abe4692f8dd92552c44044e2d5acf37760ab09e858ba92c0cbdf585f71"}
Mar 20 13:33:08 crc kubenswrapper[4895]: I0320 13:33:08.096239 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"d53c7c99db8bee3bac9432ff9561b992028ee6e6036c862b2636d0687c4393f5"}
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.035145 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"]
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.036455 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.039101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-8vjxh"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.039189 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.039890 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.053358 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dztrh\" (UniqueName: \"kubernetes.io/projected/94168b5c-bd4f-4ad1-a35e-6844e0416997-kube-api-access-dztrh\") pod \"obo-prometheus-operator-8ff7d675-25wpm\" (UID: \"94168b5c-bd4f-4ad1-a35e-6844e0416997\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.154622 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dztrh\" (UniqueName: \"kubernetes.io/projected/94168b5c-bd4f-4ad1-a35e-6844e0416997-kube-api-access-dztrh\") pod \"obo-prometheus-operator-8ff7d675-25wpm\" (UID: \"94168b5c-bd4f-4ad1-a35e-6844e0416997\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.173080 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dztrh\" (UniqueName: \"kubernetes.io/projected/94168b5c-bd4f-4ad1-a35e-6844e0416997-kube-api-access-dztrh\") pod \"obo-prometheus-operator-8ff7d675-25wpm\" (UID: \"94168b5c-bd4f-4ad1-a35e-6844e0416997\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.352680 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.363060 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"]
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.363755 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.365416 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-vqmh9"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.365986 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.378570 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(07857d0f15b120ffd3a65cee84932eec41ad785a1c1982e86a9ed500bbc7de87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.378658 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(07857d0f15b120ffd3a65cee84932eec41ad785a1c1982e86a9ed500bbc7de87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.378694 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(07857d0f15b120ffd3a65cee84932eec41ad785a1c1982e86a9ed500bbc7de87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.378757 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-25wpm_openshift-operators(94168b5c-bd4f-4ad1-a35e-6844e0416997)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-25wpm_openshift-operators(94168b5c-bd4f-4ad1-a35e-6844e0416997)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(07857d0f15b120ffd3a65cee84932eec41ad785a1c1982e86a9ed500bbc7de87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" podUID="94168b5c-bd4f-4ad1-a35e-6844e0416997"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.407643 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"]
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.408289 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.457608 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.457663 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.457697 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.457763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.558653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.558737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.558782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.558810 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.563059 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.563458 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.563818 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/029af23c-4a48-4160-bfd7-650a11a211dd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm\" (UID: \"029af23c-4a48-4160-bfd7-650a11a211dd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.564225 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e4559a68-4eab-4215-835d-37fc5f2ae439-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd\" (UID: \"e4559a68-4eab-4215-835d-37fc5f2ae439\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.699261 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.722408 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-b4g5j"]
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.723090 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.725100 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.726058 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(59b0dc3a8b8c40d0fa14ca87983b9a85137f30d74152a7685f75f509f397c409): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.726104 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(59b0dc3a8b8c40d0fa14ca87983b9a85137f30d74152a7685f75f509f397c409): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.726127 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(59b0dc3a8b8c40d0fa14ca87983b9a85137f30d74152a7685f75f509f397c409): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.726169 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators(029af23c-4a48-4160-bfd7-650a11a211dd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators(029af23c-4a48-4160-bfd7-650a11a211dd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(59b0dc3a8b8c40d0fa14ca87983b9a85137f30d74152a7685f75f509f397c409): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" podUID="029af23c-4a48-4160-bfd7-650a11a211dd"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.731480 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-b9zml"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.731643 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.759458 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(b9804d7a6fe7046ce8b250e376b2f6860260baf4bc30e331710b9b13504aa4ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.759519 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(b9804d7a6fe7046ce8b250e376b2f6860260baf4bc30e331710b9b13504aa4ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.759542 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(b9804d7a6fe7046ce8b250e376b2f6860260baf4bc30e331710b9b13504aa4ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"
Mar 20 13:33:10 crc kubenswrapper[4895]: E0320 13:33:10.759580 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators(e4559a68-4eab-4215-835d-37fc5f2ae439)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators(e4559a68-4eab-4215-835d-37fc5f2ae439)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(b9804d7a6fe7046ce8b250e376b2f6860260baf4bc30e331710b9b13504aa4ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" podUID="e4559a68-4eab-4215-835d-37fc5f2ae439"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.761557 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfhv6\" (UniqueName: \"kubernetes.io/projected/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-kube-api-access-xfhv6\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.761620 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.863256 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfhv6\" (UniqueName: \"kubernetes.io/projected/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-kube-api-access-xfhv6\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.863336 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.867415 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:10 crc kubenswrapper[4895]: I0320 13:33:10.909183 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfhv6\" (UniqueName: \"kubernetes.io/projected/875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b-kube-api-access-xfhv6\") pod \"observability-operator-6dd7dd855f-b4g5j\" (UID: \"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b\") " pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.098507 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.118334 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" event={"ID":"1a661681-cdbd-4833-81d7-6b31b8f99763","Type":"ContainerStarted","Data":"b880252b966536bfe35bf0400d8c01a71dd0164621863c8bca80de998cf35bbf"}
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.119257 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.119286 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.119326 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.124502 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(3e7141b20b110a50044045c0f026a336c77f8c89a82beae540214d5abcbd0383): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.124551 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(3e7141b20b110a50044045c0f026a336c77f8c89a82beae540214d5abcbd0383): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.124569 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(3e7141b20b110a50044045c0f026a336c77f8c89a82beae540214d5abcbd0383): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.124602 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-b4g5j_openshift-operators(875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-b4g5j_openshift-operators(875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(3e7141b20b110a50044045c0f026a336c77f8c89a82beae540214d5abcbd0383): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" podUID="875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.156451 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5df5f8d6f4-7jrjf"]
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.157065 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.159194 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.161601 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" podStartSLOduration=8.161582165 podStartE2EDuration="8.161582165s" podCreationTimestamp="2026-03-20 13:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:11.156264249 +0000 UTC m=+690.665983225" watchObservedRunningTime="2026-03-20 13:33:11.161582165 +0000 UTC m=+690.671301131"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.163026 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-48f7q"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.172619 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.183067 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.268572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b024edb6-69fb-4c9f-927f-59a137df1c0f-openshift-service-ca\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.268853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8z4r\" (UniqueName: \"kubernetes.io/projected/b024edb6-69fb-4c9f-927f-59a137df1c0f-kube-api-access-w8z4r\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.268997 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-webhook-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.269153 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-apiservice-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.370104 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b024edb6-69fb-4c9f-927f-59a137df1c0f-openshift-service-ca\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.370149 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8z4r\" (UniqueName: \"kubernetes.io/projected/b024edb6-69fb-4c9f-927f-59a137df1c0f-kube-api-access-w8z4r\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.370208 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-webhook-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.370234 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-apiservice-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.371109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b024edb6-69fb-4c9f-927f-59a137df1c0f-openshift-service-ca\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.373920 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-webhook-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.373928 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b024edb6-69fb-4c9f-927f-59a137df1c0f-apiservice-cert\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.393230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8z4r\" (UniqueName: \"kubernetes.io/projected/b024edb6-69fb-4c9f-927f-59a137df1c0f-kube-api-access-w8z4r\") pod \"perses-operator-5df5f8d6f4-7jrjf\" (UID: \"b024edb6-69fb-4c9f-927f-59a137df1c0f\") " pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.474011 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.523819 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(3da44d725cea9192f0977da748fc8203203839bdf473d1476e203d51647b99a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.523891 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(3da44d725cea9192f0977da748fc8203203839bdf473d1476e203d51647b99a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.523922 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(3da44d725cea9192f0977da748fc8203203839bdf473d1476e203d51647b99a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.523966 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5df5f8d6f4-7jrjf_openshift-operators(b024edb6-69fb-4c9f-927f-59a137df1c0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5df5f8d6f4-7jrjf_openshift-operators(b024edb6-69fb-4c9f-927f-59a137df1c0f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(3da44d725cea9192f0977da748fc8203203839bdf473d1476e203d51647b99a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" podUID="b024edb6-69fb-4c9f-927f-59a137df1c0f" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.575634 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"] Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.575745 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.576194 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.580811 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"] Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.580897 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.581269 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.593849 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"] Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.593944 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.594319 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.601587 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5df5f8d6f4-7jrjf"] Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.616192 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(3e9b0f34aaf2e7dbb86bb346212134c1aae55f438c6fa02a151c87f31d2c1693): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.616360 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(3e9b0f34aaf2e7dbb86bb346212134c1aae55f438c6fa02a151c87f31d2c1693): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.616459 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(3e9b0f34aaf2e7dbb86bb346212134c1aae55f438c6fa02a151c87f31d2c1693): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.616548 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-25wpm_openshift-operators(94168b5c-bd4f-4ad1-a35e-6844e0416997)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-25wpm_openshift-operators(94168b5c-bd4f-4ad1-a35e-6844e0416997)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-25wpm_openshift-operators_94168b5c-bd4f-4ad1-a35e-6844e0416997_0(3e9b0f34aaf2e7dbb86bb346212134c1aae55f438c6fa02a151c87f31d2c1693): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" podUID="94168b5c-bd4f-4ad1-a35e-6844e0416997" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.644154 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(72759a21aab233876f4568b7e85f62d7f033c41957d61ab626f090d3623d99e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.644223 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(72759a21aab233876f4568b7e85f62d7f033c41957d61ab626f090d3623d99e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.644246 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(72759a21aab233876f4568b7e85f62d7f033c41957d61ab626f090d3623d99e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.644295 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators(e4559a68-4eab-4215-835d-37fc5f2ae439)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators(e4559a68-4eab-4215-835d-37fc5f2ae439)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_openshift-operators_e4559a68-4eab-4215-835d-37fc5f2ae439_0(72759a21aab233876f4568b7e85f62d7f033c41957d61ab626f090d3623d99e8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" podUID="e4559a68-4eab-4215-835d-37fc5f2ae439" Mar 20 13:33:11 crc kubenswrapper[4895]: I0320 13:33:11.646635 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-b4g5j"] Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.649582 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(4e011f144fc83d50a31c22cc7d45f0846737acb9c4597211d84e6e63d3f85d1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.649662 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(4e011f144fc83d50a31c22cc7d45f0846737acb9c4597211d84e6e63d3f85d1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.649686 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(4e011f144fc83d50a31c22cc7d45f0846737acb9c4597211d84e6e63d3f85d1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:11 crc kubenswrapper[4895]: E0320 13:33:11.649732 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators(029af23c-4a48-4160-bfd7-650a11a211dd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators(029af23c-4a48-4160-bfd7-650a11a211dd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_openshift-operators_029af23c-4a48-4160-bfd7-650a11a211dd_0(4e011f144fc83d50a31c22cc7d45f0846737acb9c4597211d84e6e63d3f85d1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" podUID="029af23c-4a48-4160-bfd7-650a11a211dd" Mar 20 13:33:12 crc kubenswrapper[4895]: I0320 13:33:12.123154 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:12 crc kubenswrapper[4895]: I0320 13:33:12.123177 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:12 crc kubenswrapper[4895]: I0320 13:33:12.123695 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:12 crc kubenswrapper[4895]: I0320 13:33:12.123865 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160265 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(65d6d8bbc739331e50ae1ef71461811d61d93001494715f681589288b4fb0aa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160297 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(b10b9c69b8f6526e0d2d7b64e250402aeb5f0fa9e444c1112db68ed69d02242e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160334 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(65d6d8bbc739331e50ae1ef71461811d61d93001494715f681589288b4fb0aa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160355 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(65d6d8bbc739331e50ae1ef71461811d61d93001494715f681589288b4fb0aa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160377 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(b10b9c69b8f6526e0d2d7b64e250402aeb5f0fa9e444c1112db68ed69d02242e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160420 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(b10b9c69b8f6526e0d2d7b64e250402aeb5f0fa9e444c1112db68ed69d02242e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160482 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5df5f8d6f4-7jrjf_openshift-operators(b024edb6-69fb-4c9f-927f-59a137df1c0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5df5f8d6f4-7jrjf_openshift-operators(b024edb6-69fb-4c9f-927f-59a137df1c0f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5df5f8d6f4-7jrjf_openshift-operators_b024edb6-69fb-4c9f-927f-59a137df1c0f_0(b10b9c69b8f6526e0d2d7b64e250402aeb5f0fa9e444c1112db68ed69d02242e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" podUID="b024edb6-69fb-4c9f-927f-59a137df1c0f" Mar 20 13:33:12 crc kubenswrapper[4895]: E0320 13:33:12.160412 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-b4g5j_openshift-operators(875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-b4g5j_openshift-operators(875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-b4g5j_openshift-operators_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b_0(65d6d8bbc739331e50ae1ef71461811d61d93001494715f681589288b4fb0aa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" podUID="875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b" Mar 20 13:33:24 crc kubenswrapper[4895]: I0320 13:33:24.210850 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:24 crc kubenswrapper[4895]: I0320 13:33:24.211780 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:24 crc kubenswrapper[4895]: I0320 13:33:24.420428 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5df5f8d6f4-7jrjf"] Mar 20 13:33:25 crc kubenswrapper[4895]: I0320 13:33:25.211334 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:25 crc kubenswrapper[4895]: I0320 13:33:25.212159 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" Mar 20 13:33:25 crc kubenswrapper[4895]: I0320 13:33:25.260411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" event={"ID":"b024edb6-69fb-4c9f-927f-59a137df1c0f","Type":"ContainerStarted","Data":"a875658b1a2adf7fedece53c2d531f9a27b63ae7b05e86c7b4c5c23f125138e9"} Mar 20 13:33:25 crc kubenswrapper[4895]: I0320 13:33:25.453185 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm"] Mar 20 13:33:25 crc kubenswrapper[4895]: W0320 13:33:25.460022 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029af23c_4a48_4160_bfd7_650a11a211dd.slice/crio-42c8227fc3c9e6deafbd8fecccbf3eeb64b08f0c5a1849c3f7a4db4a0109844f WatchSource:0}: Error finding container 42c8227fc3c9e6deafbd8fecccbf3eeb64b08f0c5a1849c3f7a4db4a0109844f: Status 404 returned error can't find the container with id 42c8227fc3c9e6deafbd8fecccbf3eeb64b08f0c5a1849c3f7a4db4a0109844f Mar 20 13:33:26 crc kubenswrapper[4895]: I0320 13:33:26.210949 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:26 crc kubenswrapper[4895]: I0320 13:33:26.211380 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" Mar 20 13:33:26 crc kubenswrapper[4895]: I0320 13:33:26.265782 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" event={"ID":"029af23c-4a48-4160-bfd7-650a11a211dd","Type":"ContainerStarted","Data":"42c8227fc3c9e6deafbd8fecccbf3eeb64b08f0c5a1849c3f7a4db4a0109844f"} Mar 20 13:33:26 crc kubenswrapper[4895]: I0320 13:33:26.405283 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd"] Mar 20 13:33:26 crc kubenswrapper[4895]: W0320 13:33:26.415078 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4559a68_4eab_4215_835d_37fc5f2ae439.slice/crio-df60549ac0217fe04e341bb2d6319f0cf86f9d54e61b12484d43b351dd52e285 WatchSource:0}: Error finding container df60549ac0217fe04e341bb2d6319f0cf86f9d54e61b12484d43b351dd52e285: Status 404 returned error can't find the container with id df60549ac0217fe04e341bb2d6319f0cf86f9d54e61b12484d43b351dd52e285 Mar 20 13:33:27 crc kubenswrapper[4895]: I0320 13:33:27.211418 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:27 crc kubenswrapper[4895]: I0320 13:33:27.211415 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:27 crc kubenswrapper[4895]: I0320 13:33:27.211819 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:27 crc kubenswrapper[4895]: I0320 13:33:27.212170 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" Mar 20 13:33:27 crc kubenswrapper[4895]: I0320 13:33:27.275417 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" event={"ID":"e4559a68-4eab-4215-835d-37fc5f2ae439","Type":"ContainerStarted","Data":"df60549ac0217fe04e341bb2d6319f0cf86f9d54e61b12484d43b351dd52e285"} Mar 20 13:33:29 crc kubenswrapper[4895]: I0320 13:33:29.199066 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-b4g5j"] Mar 20 13:33:29 crc kubenswrapper[4895]: W0320 13:33:29.273178 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875ae527_38d2_4d9c_bcdc_5ca7f9a9f17b.slice/crio-b2ddace9606dc1a5a79370c4306cb14d700e699e774d91729e5f26e9c8b4fe07 WatchSource:0}: Error finding container b2ddace9606dc1a5a79370c4306cb14d700e699e774d91729e5f26e9c8b4fe07: Status 404 returned error can't find the container with id b2ddace9606dc1a5a79370c4306cb14d700e699e774d91729e5f26e9c8b4fe07 Mar 20 13:33:29 crc kubenswrapper[4895]: I0320 13:33:29.292013 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" event={"ID":"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b","Type":"ContainerStarted","Data":"b2ddace9606dc1a5a79370c4306cb14d700e699e774d91729e5f26e9c8b4fe07"} Mar 20 13:33:29 crc kubenswrapper[4895]: I0320 13:33:29.454730 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-25wpm"] Mar 20 13:33:29 crc kubenswrapper[4895]: W0320 13:33:29.465057 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94168b5c_bd4f_4ad1_a35e_6844e0416997.slice/crio-3155746dad35b33eb002ce7a395283dde98fb722effbba402cd8f59f069bdc6c 
WatchSource:0}: Error finding container 3155746dad35b33eb002ce7a395283dde98fb722effbba402cd8f59f069bdc6c: Status 404 returned error can't find the container with id 3155746dad35b33eb002ce7a395283dde98fb722effbba402cd8f59f069bdc6c Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.301306 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" event={"ID":"029af23c-4a48-4160-bfd7-650a11a211dd","Type":"ContainerStarted","Data":"5a72757bb7a8eb4758e985fef9cbf2a98783e56e18b081cfbca02c3e22150bc4"} Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.304812 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" event={"ID":"e4559a68-4eab-4215-835d-37fc5f2ae439","Type":"ContainerStarted","Data":"7b65466330cbe2926d0f22575aacde9eaee5a4f2ecf947d495c17e99bb46973e"} Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.306490 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" event={"ID":"94168b5c-bd4f-4ad1-a35e-6844e0416997","Type":"ContainerStarted","Data":"3155746dad35b33eb002ce7a395283dde98fb722effbba402cd8f59f069bdc6c"} Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.308687 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" event={"ID":"b024edb6-69fb-4c9f-927f-59a137df1c0f","Type":"ContainerStarted","Data":"3c91943ed20e400432f933969b3d02e41b657e4bb119263b9d91a40ce297696d"} Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.309064 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.328075 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm" podStartSLOduration=16.43561204 podStartE2EDuration="20.328054945s" podCreationTimestamp="2026-03-20 13:33:10 +0000 UTC" firstStartedPulling="2026-03-20 13:33:25.46350054 +0000 UTC m=+704.973219516" lastFinishedPulling="2026-03-20 13:33:29.355943445 +0000 UTC m=+708.865662421" observedRunningTime="2026-03-20 13:33:30.324669375 +0000 UTC m=+709.834388351" watchObservedRunningTime="2026-03-20 13:33:30.328054945 +0000 UTC m=+709.837773901" Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.377118 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" podStartSLOduration=14.4609782 podStartE2EDuration="19.377087085s" podCreationTimestamp="2026-03-20 13:33:11 +0000 UTC" firstStartedPulling="2026-03-20 13:33:24.442475412 +0000 UTC m=+703.952194378" lastFinishedPulling="2026-03-20 13:33:29.358584277 +0000 UTC m=+708.868303263" observedRunningTime="2026-03-20 13:33:30.371756469 +0000 UTC m=+709.881475455" watchObservedRunningTime="2026-03-20 13:33:30.377087085 +0000 UTC m=+709.886806071" Mar 20 13:33:30 crc kubenswrapper[4895]: I0320 13:33:30.410453 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd" podStartSLOduration=17.470047985 podStartE2EDuration="20.410428995s" podCreationTimestamp="2026-03-20 13:33:10 +0000 UTC" firstStartedPulling="2026-03-20 13:33:26.419332994 +0000 UTC m=+705.929051960" lastFinishedPulling="2026-03-20 13:33:29.359713984 +0000 UTC m=+708.869432970" observedRunningTime="2026-03-20 13:33:30.408255213 +0000 UTC m=+709.917974179" watchObservedRunningTime="2026-03-20 13:33:30.410428995 +0000 UTC m=+709.920147981" Mar 20 13:33:33 crc kubenswrapper[4895]: I0320 13:33:33.717908 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-f6s5l" Mar 20 13:33:34 crc kubenswrapper[4895]: I0320 13:33:34.330545 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" event={"ID":"875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b","Type":"ContainerStarted","Data":"b292ef0fbe07291f0e095b09373887c29758b073cdd714d73456c30e0664e3e0"} Mar 20 13:33:34 crc kubenswrapper[4895]: I0320 13:33:34.330883 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:34 crc kubenswrapper[4895]: I0320 13:33:34.332070 4895 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-b4g5j container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" start-of-body= Mar 20 13:33:34 crc kubenswrapper[4895]: I0320 13:33:34.332130 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" podUID="875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" Mar 20 13:33:34 crc kubenswrapper[4895]: I0320 13:33:34.351339 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" podStartSLOduration=19.521430461 podStartE2EDuration="24.351313765s" podCreationTimestamp="2026-03-20 13:33:10 +0000 UTC" firstStartedPulling="2026-03-20 13:33:29.322225258 +0000 UTC m=+708.831944244" lastFinishedPulling="2026-03-20 13:33:34.152108582 +0000 UTC m=+713.661827548" observedRunningTime="2026-03-20 13:33:34.348078119 +0000 UTC m=+713.857797095" watchObservedRunningTime="2026-03-20 13:33:34.351313765 +0000 UTC m=+713.861032741" Mar 20 13:33:35 crc kubenswrapper[4895]: 
I0320 13:33:35.349136 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" event={"ID":"94168b5c-bd4f-4ad1-a35e-6844e0416997","Type":"ContainerStarted","Data":"392c73d815df4529eed3f4b127ec3711ef53d09e35b7a0c4fb04fb736a13ea4a"} Mar 20 13:33:35 crc kubenswrapper[4895]: I0320 13:33:35.351514 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-b4g5j" Mar 20 13:33:35 crc kubenswrapper[4895]: I0320 13:33:35.370922 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-25wpm" podStartSLOduration=20.714048428 podStartE2EDuration="25.37090247s" podCreationTimestamp="2026-03-20 13:33:10 +0000 UTC" firstStartedPulling="2026-03-20 13:33:29.469691116 +0000 UTC m=+708.979410082" lastFinishedPulling="2026-03-20 13:33:34.126545138 +0000 UTC m=+713.636264124" observedRunningTime="2026-03-20 13:33:35.367694343 +0000 UTC m=+714.877413309" watchObservedRunningTime="2026-03-20 13:33:35.37090247 +0000 UTC m=+714.880621446" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.479658 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5df5f8d6f4-7jrjf" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.789415 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6drpc"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.790556 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.793380 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-t9k9j" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.793441 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.793744 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.802083 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-h6r5b"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.802789 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-h6r5b" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.804874 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-n6ckw" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.820183 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6drpc"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.832279 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dm9rw"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.833177 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.836008 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jx5mr" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.836658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vv5\" (UniqueName: \"kubernetes.io/projected/5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe-kube-api-access-n8vv5\") pod \"cert-manager-858654f9db-h6r5b\" (UID: \"5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe\") " pod="cert-manager/cert-manager-858654f9db-h6r5b" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.836732 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftrj\" (UniqueName: \"kubernetes.io/projected/fb60419b-e2e6-4f98-b5c8-846b4a670eb4-kube-api-access-6ftrj\") pod \"cert-manager-cainjector-cf98fcc89-6drpc\" (UID: \"fb60419b-e2e6-4f98-b5c8-846b4a670eb4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.839560 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dm9rw"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.842932 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-h6r5b"] Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.938023 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftrj\" (UniqueName: \"kubernetes.io/projected/fb60419b-e2e6-4f98-b5c8-846b4a670eb4-kube-api-access-6ftrj\") pod \"cert-manager-cainjector-cf98fcc89-6drpc\" (UID: \"fb60419b-e2e6-4f98-b5c8-846b4a670eb4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.938081 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vv5\" (UniqueName: \"kubernetes.io/projected/5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe-kube-api-access-n8vv5\") pod \"cert-manager-858654f9db-h6r5b\" (UID: \"5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe\") " pod="cert-manager/cert-manager-858654f9db-h6r5b" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.938117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsk9\" (UniqueName: \"kubernetes.io/projected/b5f5308f-82f4-432c-b1fa-b7a9554c691b-kube-api-access-rvsk9\") pod \"cert-manager-webhook-687f57d79b-dm9rw\" (UID: \"b5f5308f-82f4-432c-b1fa-b7a9554c691b\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.957646 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftrj\" (UniqueName: \"kubernetes.io/projected/fb60419b-e2e6-4f98-b5c8-846b4a670eb4-kube-api-access-6ftrj\") pod \"cert-manager-cainjector-cf98fcc89-6drpc\" (UID: \"fb60419b-e2e6-4f98-b5c8-846b4a670eb4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" Mar 20 13:33:41 crc kubenswrapper[4895]: I0320 13:33:41.964282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vv5\" (UniqueName: \"kubernetes.io/projected/5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe-kube-api-access-n8vv5\") pod \"cert-manager-858654f9db-h6r5b\" (UID: \"5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe\") " pod="cert-manager/cert-manager-858654f9db-h6r5b" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.039333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvsk9\" (UniqueName: \"kubernetes.io/projected/b5f5308f-82f4-432c-b1fa-b7a9554c691b-kube-api-access-rvsk9\") pod \"cert-manager-webhook-687f57d79b-dm9rw\" (UID: \"b5f5308f-82f4-432c-b1fa-b7a9554c691b\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.061159 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsk9\" (UniqueName: \"kubernetes.io/projected/b5f5308f-82f4-432c-b1fa-b7a9554c691b-kube-api-access-rvsk9\") pod \"cert-manager-webhook-687f57d79b-dm9rw\" (UID: \"b5f5308f-82f4-432c-b1fa-b7a9554c691b\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.120462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.140709 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-h6r5b" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.148044 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.409921 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-h6r5b"] Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.437130 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dm9rw"] Mar 20 13:33:42 crc kubenswrapper[4895]: I0320 13:33:42.558754 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6drpc"] Mar 20 13:33:42 crc kubenswrapper[4895]: W0320 13:33:42.562957 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb60419b_e2e6_4f98_b5c8_846b4a670eb4.slice/crio-b7a52baac890f2091e115f96ddfe1469494d59d5f5ba17f28a15fcbdc06c84ea WatchSource:0}: Error finding container b7a52baac890f2091e115f96ddfe1469494d59d5f5ba17f28a15fcbdc06c84ea: 
Status 404 returned error can't find the container with id b7a52baac890f2091e115f96ddfe1469494d59d5f5ba17f28a15fcbdc06c84ea Mar 20 13:33:43 crc kubenswrapper[4895]: I0320 13:33:43.397316 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" event={"ID":"b5f5308f-82f4-432c-b1fa-b7a9554c691b","Type":"ContainerStarted","Data":"07133b7ea14b3c36162e5f06628711c3fffd656252be5e89778c66c10a16f342"} Mar 20 13:33:43 crc kubenswrapper[4895]: I0320 13:33:43.399139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-h6r5b" event={"ID":"5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe","Type":"ContainerStarted","Data":"84ebe99b8f2bfce73a0d45ab1d9542533fe1cfaa6c1ecfd24b5648fc98a280a7"} Mar 20 13:33:43 crc kubenswrapper[4895]: I0320 13:33:43.400386 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" event={"ID":"fb60419b-e2e6-4f98-b5c8-846b4a670eb4","Type":"ContainerStarted","Data":"b7a52baac890f2091e115f96ddfe1469494d59d5f5ba17f28a15fcbdc06c84ea"} Mar 20 13:33:53 crc kubenswrapper[4895]: I0320 13:33:53.682261 4895 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:33:56 crc kubenswrapper[4895]: I0320 13:33:56.485119 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-h6r5b" event={"ID":"5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe","Type":"ContainerStarted","Data":"08fbbbeb03597d66dab382bb5b2372bbcb88a06b4439b0b671dda858d2ce6ce2"} Mar 20 13:33:56 crc kubenswrapper[4895]: I0320 13:33:56.486639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" event={"ID":"b5f5308f-82f4-432c-b1fa-b7a9554c691b","Type":"ContainerStarted","Data":"b4103baa80ba1d4c8b60b937c0f7c648b39ab2d33efcca5947e539118d508371"} Mar 20 13:33:56 crc kubenswrapper[4895]: I0320 
13:33:56.486812 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:33:56 crc kubenswrapper[4895]: I0320 13:33:56.509294 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-h6r5b" podStartSLOduration=1.870552971 podStartE2EDuration="15.509275725s" podCreationTimestamp="2026-03-20 13:33:41 +0000 UTC" firstStartedPulling="2026-03-20 13:33:42.4162951 +0000 UTC m=+721.926014086" lastFinishedPulling="2026-03-20 13:33:56.055017874 +0000 UTC m=+735.564736840" observedRunningTime="2026-03-20 13:33:56.505545153 +0000 UTC m=+736.015264119" watchObservedRunningTime="2026-03-20 13:33:56.509275725 +0000 UTC m=+736.018994691" Mar 20 13:33:56 crc kubenswrapper[4895]: I0320 13:33:56.530319 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" podStartSLOduration=1.9199598450000002 podStartE2EDuration="15.530299332s" podCreationTimestamp="2026-03-20 13:33:41 +0000 UTC" firstStartedPulling="2026-03-20 13:33:42.449564458 +0000 UTC m=+721.959283414" lastFinishedPulling="2026-03-20 13:33:56.059903935 +0000 UTC m=+735.569622901" observedRunningTime="2026-03-20 13:33:56.526921179 +0000 UTC m=+736.036640145" watchObservedRunningTime="2026-03-20 13:33:56.530299332 +0000 UTC m=+736.040018298" Mar 20 13:33:57 crc kubenswrapper[4895]: I0320 13:33:57.494812 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" event={"ID":"fb60419b-e2e6-4f98-b5c8-846b4a670eb4","Type":"ContainerStarted","Data":"d9d384be238a0e497510e2cdc0518610d66127c1fd1fc95cacf60fdcf3f2042c"} Mar 20 13:33:57 crc kubenswrapper[4895]: I0320 13:33:57.518510 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6drpc" podStartSLOduration=2.282015078 
podStartE2EDuration="16.518479522s" podCreationTimestamp="2026-03-20 13:33:41 +0000 UTC" firstStartedPulling="2026-03-20 13:33:42.565418577 +0000 UTC m=+722.075137543" lastFinishedPulling="2026-03-20 13:33:56.801883021 +0000 UTC m=+736.311601987" observedRunningTime="2026-03-20 13:33:57.514047664 +0000 UTC m=+737.023766620" watchObservedRunningTime="2026-03-20 13:33:57.518479522 +0000 UTC m=+737.028198558" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.139671 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-gpgqn"] Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.140463 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.144785 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.144843 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.146301 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.153365 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-gpgqn"] Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.192531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5jv\" (UniqueName: \"kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv\") pod \"auto-csr-approver-29566894-gpgqn\" (UID: \"dc86faca-eea0-4505-a809-8cbbdb6342fa\") " pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.295694 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc5jv\" (UniqueName: \"kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv\") pod \"auto-csr-approver-29566894-gpgqn\" (UID: \"dc86faca-eea0-4505-a809-8cbbdb6342fa\") " pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.329332 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc5jv\" (UniqueName: \"kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv\") pod \"auto-csr-approver-29566894-gpgqn\" (UID: \"dc86faca-eea0-4505-a809-8cbbdb6342fa\") " pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.467146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:00 crc kubenswrapper[4895]: I0320 13:34:00.922442 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-gpgqn"] Mar 20 13:34:00 crc kubenswrapper[4895]: W0320 13:34:00.926121 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc86faca_eea0_4505_a809_8cbbdb6342fa.slice/crio-5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735 WatchSource:0}: Error finding container 5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735: Status 404 returned error can't find the container with id 5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735 Mar 20 13:34:01 crc kubenswrapper[4895]: I0320 13:34:01.523559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" event={"ID":"dc86faca-eea0-4505-a809-8cbbdb6342fa","Type":"ContainerStarted","Data":"5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735"} Mar 20 
13:34:02 crc kubenswrapper[4895]: I0320 13:34:02.151432 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dm9rw" Mar 20 13:34:03 crc kubenswrapper[4895]: I0320 13:34:03.541174 4895 generic.go:334] "Generic (PLEG): container finished" podID="dc86faca-eea0-4505-a809-8cbbdb6342fa" containerID="6acb94277af9cadcf887066793e5e4239f8aab422e70f38444289f147f2918bf" exitCode=0 Mar 20 13:34:03 crc kubenswrapper[4895]: I0320 13:34:03.541226 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" event={"ID":"dc86faca-eea0-4505-a809-8cbbdb6342fa","Type":"ContainerDied","Data":"6acb94277af9cadcf887066793e5e4239f8aab422e70f38444289f147f2918bf"} Mar 20 13:34:04 crc kubenswrapper[4895]: I0320 13:34:04.786870 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:04 crc kubenswrapper[4895]: I0320 13:34:04.866327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc5jv\" (UniqueName: \"kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv\") pod \"dc86faca-eea0-4505-a809-8cbbdb6342fa\" (UID: \"dc86faca-eea0-4505-a809-8cbbdb6342fa\") " Mar 20 13:34:04 crc kubenswrapper[4895]: I0320 13:34:04.877854 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv" (OuterVolumeSpecName: "kube-api-access-nc5jv") pod "dc86faca-eea0-4505-a809-8cbbdb6342fa" (UID: "dc86faca-eea0-4505-a809-8cbbdb6342fa"). InnerVolumeSpecName "kube-api-access-nc5jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:04 crc kubenswrapper[4895]: I0320 13:34:04.968820 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc5jv\" (UniqueName: \"kubernetes.io/projected/dc86faca-eea0-4505-a809-8cbbdb6342fa-kube-api-access-nc5jv\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:05 crc kubenswrapper[4895]: I0320 13:34:05.555497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" event={"ID":"dc86faca-eea0-4505-a809-8cbbdb6342fa","Type":"ContainerDied","Data":"5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735"} Mar 20 13:34:05 crc kubenswrapper[4895]: I0320 13:34:05.555834 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2a159aead35a066b21c0382dd4a4643c995d521bc5d739903e589fb0dd9735" Mar 20 13:34:05 crc kubenswrapper[4895]: I0320 13:34:05.555543 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-gpgqn" Mar 20 13:34:05 crc kubenswrapper[4895]: I0320 13:34:05.838896 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-vt8mx"] Mar 20 13:34:05 crc kubenswrapper[4895]: I0320 13:34:05.845731 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-vt8mx"] Mar 20 13:34:07 crc kubenswrapper[4895]: I0320 13:34:07.222378 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83f6aaf-7a04-4611-8654-826c470c1f94" path="/var/lib/kubelet/pods/c83f6aaf-7a04-4611-8654-826c470c1f94/volumes" Mar 20 13:34:22 crc kubenswrapper[4895]: I0320 13:34:22.296700 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:34:22 crc kubenswrapper[4895]: I0320 13:34:22.297377 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.973152 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz"] Mar 20 13:34:24 crc kubenswrapper[4895]: E0320 13:34:24.973743 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc86faca-eea0-4505-a809-8cbbdb6342fa" containerName="oc" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.973759 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc86faca-eea0-4505-a809-8cbbdb6342fa" containerName="oc" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.973888 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc86faca-eea0-4505-a809-8cbbdb6342fa" containerName="oc" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.974844 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.977716 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:34:24 crc kubenswrapper[4895]: I0320 13:34:24.990998 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz"] Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.063037 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvb48\" (UniqueName: \"kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.063108 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.063149 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: 
I0320 13:34:25.164867 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.165568 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvb48\" (UniqueName: \"kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.165751 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.166067 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.166456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.201446 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvb48\" (UniqueName: \"kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.293640 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" Mar 20 13:34:25 crc kubenswrapper[4895]: I0320 13:34:25.727599 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz"] Mar 20 13:34:25 crc kubenswrapper[4895]: W0320 13:34:25.735868 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec8c6d6_c364_4bb7_aea4_931b5b4774e1.slice/crio-0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60 WatchSource:0}: Error finding container 0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60: Status 404 returned error can't find the container with id 0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60 Mar 20 13:34:26 crc kubenswrapper[4895]: I0320 13:34:26.717025 4895 generic.go:334] "Generic (PLEG): container finished" podID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerID="f7d12e00885791b0a02e7c814c6574406513fe62c4a5580f26a708549b5fd801" exitCode=0 
Mar 20 13:34:26 crc kubenswrapper[4895]: I0320 13:34:26.717079 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" event={"ID":"cec8c6d6-c364-4bb7-aea4-931b5b4774e1","Type":"ContainerDied","Data":"f7d12e00885791b0a02e7c814c6574406513fe62c4a5580f26a708549b5fd801"}
Mar 20 13:34:26 crc kubenswrapper[4895]: I0320 13:34:26.717140 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" event={"ID":"cec8c6d6-c364-4bb7-aea4-931b5b4774e1","Type":"ContainerStarted","Data":"0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60"}
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.293739 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"]
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.295786 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.299703 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.300239 4895 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-htntk"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.300570 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.302864 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.333093 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.334561 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.351249 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.395590 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.395706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxh2b\" (UniqueName: \"kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.395753 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.395805 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.395833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6n6v\" (UniqueName: \"kubernetes.io/projected/611e525b-65c7-4184-9755-cee29b2a10c1-kube-api-access-d6n6v\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497118 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxh2b\" (UniqueName: \"kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497180 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497217 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497241 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6n6v\" (UniqueName: \"kubernetes.io/projected/611e525b-65c7-4184-9755-cee29b2a10c1-kube-api-access-d6n6v\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497284 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.497739 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.498160 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.500048 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.500094 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/25855728d4bb53af39995366c7f68f56fd7bb17ea84435d1f83e5fe294c0e139/globalmount\"" pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.526968 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-038dff8c-5b1e-4f3a-80a7-391f92cf0591\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.528503 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6n6v\" (UniqueName: \"kubernetes.io/projected/611e525b-65c7-4184-9755-cee29b2a10c1-kube-api-access-d6n6v\") pod \"minio\" (UID: \"611e525b-65c7-4184-9755-cee29b2a10c1\") " pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.529608 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxh2b\" (UniqueName: \"kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b\") pod \"redhat-operators-t4msb\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") " pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.618901 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.656911 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:27 crc kubenswrapper[4895]: I0320 13:34:27.899174 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.124018 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Mar 20 13:34:28 crc kubenswrapper[4895]: W0320 13:34:28.124338 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod611e525b_65c7_4184_9755_cee29b2a10c1.slice/crio-a68eb9bfa9422aa4e2c05c42e0767ba17f55c82457af72e41ba9d05a51a2c928 WatchSource:0}: Error finding container a68eb9bfa9422aa4e2c05c42e0767ba17f55c82457af72e41ba9d05a51a2c928: Status 404 returned error can't find the container with id a68eb9bfa9422aa4e2c05c42e0767ba17f55c82457af72e41ba9d05a51a2c928
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.733698 4895 generic.go:334] "Generic (PLEG): container finished" podID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerID="1c72aa6b8361bd39f95eed749bac542b1aca116671520f1745e39cedf5797766" exitCode=0
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.733756 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" event={"ID":"cec8c6d6-c364-4bb7-aea4-931b5b4774e1","Type":"ContainerDied","Data":"1c72aa6b8361bd39f95eed749bac542b1aca116671520f1745e39cedf5797766"}
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.735251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"611e525b-65c7-4184-9755-cee29b2a10c1","Type":"ContainerStarted","Data":"a68eb9bfa9422aa4e2c05c42e0767ba17f55c82457af72e41ba9d05a51a2c928"}
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.737743 4895 generic.go:334] "Generic (PLEG): container finished" podID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerID="bb25d7f6ef1a327a4754b6ec0251796e316055dc3864465a22208b744077d16f" exitCode=0
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.737806 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerDied","Data":"bb25d7f6ef1a327a4754b6ec0251796e316055dc3864465a22208b744077d16f"}
Mar 20 13:34:28 crc kubenswrapper[4895]: I0320 13:34:28.737838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerStarted","Data":"e20d75d15906662de31a218b8c398dd4e6ceb633ba0796167ed78511b9ee1c5d"}
Mar 20 13:34:29 crc kubenswrapper[4895]: I0320 13:34:29.746236 4895 generic.go:334] "Generic (PLEG): container finished" podID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerID="276eff37e6e22b8cc22dfc883cf4d849f92e744bbc9148869581f4ff76bc73b7" exitCode=0
Mar 20 13:34:29 crc kubenswrapper[4895]: I0320 13:34:29.746486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" event={"ID":"cec8c6d6-c364-4bb7-aea4-931b5b4774e1","Type":"ContainerDied","Data":"276eff37e6e22b8cc22dfc883cf4d849f92e744bbc9148869581f4ff76bc73b7"}
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.290688 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz"
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.344091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle\") pod \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") "
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.344210 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvb48\" (UniqueName: \"kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48\") pod \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") "
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.344302 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util\") pod \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\" (UID: \"cec8c6d6-c364-4bb7-aea4-931b5b4774e1\") "
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.345670 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle" (OuterVolumeSpecName: "bundle") pod "cec8c6d6-c364-4bb7-aea4-931b5b4774e1" (UID: "cec8c6d6-c364-4bb7-aea4-931b5b4774e1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.345956 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.350461 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48" (OuterVolumeSpecName: "kube-api-access-bvb48") pod "cec8c6d6-c364-4bb7-aea4-931b5b4774e1" (UID: "cec8c6d6-c364-4bb7-aea4-931b5b4774e1"). InnerVolumeSpecName "kube-api-access-bvb48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.365311 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util" (OuterVolumeSpecName: "util") pod "cec8c6d6-c364-4bb7-aea4-931b5b4774e1" (UID: "cec8c6d6-c364-4bb7-aea4-931b5b4774e1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.447097 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvb48\" (UniqueName: \"kubernetes.io/projected/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-kube-api-access-bvb48\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.447130 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cec8c6d6-c364-4bb7-aea4-931b5b4774e1-util\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.762804 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz" event={"ID":"cec8c6d6-c364-4bb7-aea4-931b5b4774e1","Type":"ContainerDied","Data":"0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60"}
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.762864 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0534b2f0d80d506f94a85a5167aaed822597d14544e4ef2d3bb6c587e2a6bb60"
Mar 20 13:34:31 crc kubenswrapper[4895]: I0320 13:34:31.762960 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz"
Mar 20 13:34:33 crc kubenswrapper[4895]: I0320 13:34:33.774334 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"611e525b-65c7-4184-9755-cee29b2a10c1","Type":"ContainerStarted","Data":"757b3e18031c657394b36f84df6f0a6222be3043aad815ad312449f35b403927"}
Mar 20 13:34:33 crc kubenswrapper[4895]: I0320 13:34:33.775820 4895 generic.go:334] "Generic (PLEG): container finished" podID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerID="0989bdf38627277aa7335e726159ceacd538dd390aa920a9a23c30fd9a3abf80" exitCode=0
Mar 20 13:34:33 crc kubenswrapper[4895]: I0320 13:34:33.775856 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerDied","Data":"0989bdf38627277aa7335e726159ceacd538dd390aa920a9a23c30fd9a3abf80"}
Mar 20 13:34:33 crc kubenswrapper[4895]: I0320 13:34:33.788365 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.242628267 podStartE2EDuration="9.78834548s" podCreationTimestamp="2026-03-20 13:34:24 +0000 UTC" firstStartedPulling="2026-03-20 13:34:28.127293601 +0000 UTC m=+767.637012567" lastFinishedPulling="2026-03-20 13:34:32.673010814 +0000 UTC m=+772.182729780" observedRunningTime="2026-03-20 13:34:33.786625418 +0000 UTC m=+773.296344394" watchObservedRunningTime="2026-03-20 13:34:33.78834548 +0000 UTC m=+773.298064446"
Mar 20 13:34:34 crc kubenswrapper[4895]: I0320 13:34:34.783888 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerStarted","Data":"3c11038ee2d16beeb430d3af073958c61f66b7ee1fea3e8bd7889c8d915c86be"}
Mar 20 13:34:34 crc kubenswrapper[4895]: I0320 13:34:34.830171 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t4msb" podStartSLOduration=2.279310789 podStartE2EDuration="7.830154739s" podCreationTimestamp="2026-03-20 13:34:27 +0000 UTC" firstStartedPulling="2026-03-20 13:34:28.752467434 +0000 UTC m=+768.262186400" lastFinishedPulling="2026-03-20 13:34:34.303311354 +0000 UTC m=+773.813030350" observedRunningTime="2026-03-20 13:34:34.827487763 +0000 UTC m=+774.337206739" watchObservedRunningTime="2026-03-20 13:34:34.830154739 +0000 UTC m=+774.339873705"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.952179 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"]
Mar 20 13:34:36 crc kubenswrapper[4895]: E0320 13:34:36.952735 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="extract"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.952747 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="extract"
Mar 20 13:34:36 crc kubenswrapper[4895]: E0320 13:34:36.952761 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="util"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.952768 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="util"
Mar 20 13:34:36 crc kubenswrapper[4895]: E0320 13:34:36.952781 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="pull"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.952787 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="pull"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.952892 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec8c6d6-c364-4bb7-aea4-931b5b4774e1" containerName="extract"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.953489 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.957696 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.957720 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.958289 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.958872 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-xkt9q"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.959597 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.959975 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Mar 20 13:34:36 crc kubenswrapper[4895]: I0320 13:34:36.970373 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"]
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.138474 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-webhook-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.138553 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-apiservice-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.138602 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-manager-config\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.138644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjk8q\" (UniqueName: \"kubernetes.io/projected/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-kube-api-access-jjk8q\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.138765 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.240014 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-apiservice-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.240089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-manager-config\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.240129 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjk8q\" (UniqueName: \"kubernetes.io/projected/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-kube-api-access-jjk8q\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.240154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.240202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-webhook-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.241178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-manager-config\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.245664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.246116 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-apiservice-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.248149 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-webhook-cert\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.258846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjk8q\" (UniqueName: \"kubernetes.io/projected/bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca-kube-api-access-jjk8q\") pod \"loki-operator-controller-manager-dd7dbfbcf-qkd9d\" (UID: \"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.269327 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.502514 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"]
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.657410 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.657458 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:37 crc kubenswrapper[4895]: I0320 13:34:37.799378 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d" event={"ID":"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca","Type":"ContainerStarted","Data":"76e59bebe6f394a76164a6e3cf57bea4f0087ac6ac938257eb224b12ffc17915"}
Mar 20 13:34:38 crc kubenswrapper[4895]: I0320 13:34:38.706666 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4msb" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:34:38 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s
Mar 20 13:34:38 crc kubenswrapper[4895]: >
Mar 20 13:34:41 crc kubenswrapper[4895]: I0320 13:34:41.929662 4895 scope.go:117] "RemoveContainer" containerID="d95b253296ff25bab80900db298daca34b7737467616fe6c1f617c22adedcacf"
Mar 20 13:34:43 crc kubenswrapper[4895]: I0320 13:34:43.831949 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d" event={"ID":"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca","Type":"ContainerStarted","Data":"8932c4380ecd2a5bc507b833295240adf0afa9910e4927a0c96e1b7508ecd49e"}
Mar 20 13:34:47 crc kubenswrapper[4895]: I0320 13:34:47.712554 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:47 crc kubenswrapper[4895]: I0320 13:34:47.750773 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:50 crc kubenswrapper[4895]: I0320 13:34:50.113963 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:50 crc kubenswrapper[4895]: I0320 13:34:50.114836 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t4msb" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="registry-server" containerID="cri-o://3c11038ee2d16beeb430d3af073958c61f66b7ee1fea3e8bd7889c8d915c86be" gracePeriod=2
Mar 20 13:34:50 crc kubenswrapper[4895]: I0320 13:34:50.877866 4895 generic.go:334] "Generic (PLEG): container finished" podID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerID="3c11038ee2d16beeb430d3af073958c61f66b7ee1fea3e8bd7889c8d915c86be" exitCode=0
Mar 20 13:34:50 crc kubenswrapper[4895]: I0320 13:34:50.877941 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerDied","Data":"3c11038ee2d16beeb430d3af073958c61f66b7ee1fea3e8bd7889c8d915c86be"}
Mar 20 13:34:50 crc kubenswrapper[4895]: I0320 13:34:50.879882 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d" event={"ID":"bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca","Type":"ContainerStarted","Data":"741584ccc257fa8e6eae15c96315d9b8a0ec2dccb2fc6e3a273f4188b24ed9c1"}
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.610107 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.766758 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content\") pod \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") "
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.766848 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities\") pod \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") "
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.766909 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxh2b\" (UniqueName: \"kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b\") pod \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\" (UID: \"bc8b6e88-90b7-4092-8c31-7a0cb2457150\") "
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.768770 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities" (OuterVolumeSpecName: "utilities") pod "bc8b6e88-90b7-4092-8c31-7a0cb2457150" (UID: "bc8b6e88-90b7-4092-8c31-7a0cb2457150"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.772110 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b" (OuterVolumeSpecName: "kube-api-access-wxh2b") pod "bc8b6e88-90b7-4092-8c31-7a0cb2457150" (UID: "bc8b6e88-90b7-4092-8c31-7a0cb2457150"). InnerVolumeSpecName "kube-api-access-wxh2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.868842 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.868888 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxh2b\" (UniqueName: \"kubernetes.io/projected/bc8b6e88-90b7-4092-8c31-7a0cb2457150-kube-api-access-wxh2b\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.889902 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4msb"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.890083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4msb" event={"ID":"bc8b6e88-90b7-4092-8c31-7a0cb2457150","Type":"ContainerDied","Data":"e20d75d15906662de31a218b8c398dd4e6ceb633ba0796167ed78511b9ee1c5d"}
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.890116 4895 scope.go:117] "RemoveContainer" containerID="3c11038ee2d16beeb430d3af073958c61f66b7ee1fea3e8bd7889c8d915c86be"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.890444 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.893253 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.900959 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc8b6e88-90b7-4092-8c31-7a0cb2457150" (UID: "bc8b6e88-90b7-4092-8c31-7a0cb2457150"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.910536 4895 scope.go:117] "RemoveContainer" containerID="0989bdf38627277aa7335e726159ceacd538dd390aa920a9a23c30fd9a3abf80"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.942638 4895 scope.go:117] "RemoveContainer" containerID="bb25d7f6ef1a327a4754b6ec0251796e316055dc3864465a22208b744077d16f"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.957155 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-dd7dbfbcf-qkd9d" podStartSLOduration=3.630297904 podStartE2EDuration="15.957131694s" podCreationTimestamp="2026-03-20 13:34:36 +0000 UTC" firstStartedPulling="2026-03-20 13:34:37.52150425 +0000 UTC m=+777.031223216" lastFinishedPulling="2026-03-20 13:34:49.84833804 +0000 UTC m=+789.358057006" observedRunningTime="2026-03-20 13:34:51.92599081 +0000 UTC m=+791.435709786" watchObservedRunningTime="2026-03-20 13:34:51.957131694 +0000 UTC m=+791.466850680"
Mar 20 13:34:51 crc kubenswrapper[4895]: I0320 13:34:51.970574 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc8b6e88-90b7-4092-8c31-7a0cb2457150-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:52 crc kubenswrapper[4895]: I0320 13:34:52.226711 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:52 crc kubenswrapper[4895]: I0320 13:34:52.233945 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t4msb"]
Mar 20 13:34:52 crc kubenswrapper[4895]: I0320 13:34:52.297318 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:34:52 crc kubenswrapper[4895]: I0320 13:34:52.297630 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:34:53 crc kubenswrapper[4895]: I0320 13:34:53.220654 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" path="/var/lib/kubelet/pods/bc8b6e88-90b7-4092-8c31-7a0cb2457150/volumes" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.945944 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k"] Mar 20 13:35:21 crc kubenswrapper[4895]: E0320 13:35:21.946531 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="extract-utilities" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.946542 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="extract-utilities" Mar 20 13:35:21 crc kubenswrapper[4895]: E0320 13:35:21.946560 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="registry-server" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.946566 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="registry-server" Mar 20 13:35:21 crc kubenswrapper[4895]: E0320 13:35:21.946575 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="extract-content" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.946581 4895 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="extract-content" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.946682 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8b6e88-90b7-4092-8c31-7a0cb2457150" containerName="registry-server" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.947544 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.949294 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:35:21 crc kubenswrapper[4895]: I0320 13:35:21.959486 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k"] Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.053710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.053816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprnn\" (UniqueName: \"kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.053870 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.154942 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprnn\" (UniqueName: \"kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.154994 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.155027 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.155486 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle\") 
pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.155604 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.184541 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprnn\" (UniqueName: \"kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.269017 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.297062 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.297136 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.297192 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.297884 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.297975 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85" gracePeriod=600 Mar 20 13:35:22 crc kubenswrapper[4895]: I0320 13:35:22.504267 4895 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k"] Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.150017 4895 generic.go:334] "Generic (PLEG): container finished" podID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerID="ee2bf730a07c29da86e5ce6714903a357261d792b2ea0c373f39a658f0314faa" exitCode=0 Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.150275 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" event={"ID":"bbb0cccd-3628-4791-a0e2-c4042fb00e33","Type":"ContainerDied","Data":"ee2bf730a07c29da86e5ce6714903a357261d792b2ea0c373f39a658f0314faa"} Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.150299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" event={"ID":"bbb0cccd-3628-4791-a0e2-c4042fb00e33","Type":"ContainerStarted","Data":"cddb5d51ab214708b82e9bfd4b5e8d39af71878c9aef67ea2de5369a6cdc3b4d"} Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.151741 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.154608 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85" exitCode=0 Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.154636 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85"} Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.154657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6"} Mar 20 13:35:23 crc kubenswrapper[4895]: I0320 13:35:23.154674 4895 scope.go:117] "RemoveContainer" containerID="2f0b1ce387ae71eaad93fa01e959a773aef89619d10cc1be529c2bb967dfceee" Mar 20 13:35:25 crc kubenswrapper[4895]: I0320 13:35:25.169195 4895 generic.go:334] "Generic (PLEG): container finished" podID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerID="d6a3e1018a24c8111bf7a41220a28e20f3e7cf138650719e688e97d6540d7bd3" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4895]: I0320 13:35:25.169404 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" event={"ID":"bbb0cccd-3628-4791-a0e2-c4042fb00e33","Type":"ContainerDied","Data":"d6a3e1018a24c8111bf7a41220a28e20f3e7cf138650719e688e97d6540d7bd3"} Mar 20 13:35:26 crc kubenswrapper[4895]: I0320 13:35:26.176607 4895 generic.go:334] "Generic (PLEG): container finished" podID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerID="58bbb6cacdd2f5ce2cd082765864dbe24340af3a64f7d393d18c75b72468ff0d" exitCode=0 Mar 20 13:35:26 crc kubenswrapper[4895]: I0320 13:35:26.176667 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" event={"ID":"bbb0cccd-3628-4791-a0e2-c4042fb00e33","Type":"ContainerDied","Data":"58bbb6cacdd2f5ce2cd082765864dbe24340af3a64f7d393d18c75b72468ff0d"} Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.401510 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.523822 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle\") pod \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.523886 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util\") pod \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.523917 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprnn\" (UniqueName: \"kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn\") pod \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\" (UID: \"bbb0cccd-3628-4791-a0e2-c4042fb00e33\") " Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.524823 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle" (OuterVolumeSpecName: "bundle") pod "bbb0cccd-3628-4791-a0e2-c4042fb00e33" (UID: "bbb0cccd-3628-4791-a0e2-c4042fb00e33"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.534565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn" (OuterVolumeSpecName: "kube-api-access-jprnn") pod "bbb0cccd-3628-4791-a0e2-c4042fb00e33" (UID: "bbb0cccd-3628-4791-a0e2-c4042fb00e33"). InnerVolumeSpecName "kube-api-access-jprnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.538347 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util" (OuterVolumeSpecName: "util") pod "bbb0cccd-3628-4791-a0e2-c4042fb00e33" (UID: "bbb0cccd-3628-4791-a0e2-c4042fb00e33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.625711 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.625767 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbb0cccd-3628-4791-a0e2-c4042fb00e33-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:27 crc kubenswrapper[4895]: I0320 13:35:27.625787 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprnn\" (UniqueName: \"kubernetes.io/projected/bbb0cccd-3628-4791-a0e2-c4042fb00e33-kube-api-access-jprnn\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:28 crc kubenswrapper[4895]: I0320 13:35:28.190097 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" event={"ID":"bbb0cccd-3628-4791-a0e2-c4042fb00e33","Type":"ContainerDied","Data":"cddb5d51ab214708b82e9bfd4b5e8d39af71878c9aef67ea2de5369a6cdc3b4d"} Mar 20 13:35:28 crc kubenswrapper[4895]: I0320 13:35:28.190136 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cddb5d51ab214708b82e9bfd4b5e8d39af71878c9aef67ea2de5369a6cdc3b4d" Mar 20 13:35:28 crc kubenswrapper[4895]: I0320 13:35:28.190162 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.696267 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q"] Mar 20 13:35:31 crc kubenswrapper[4895]: E0320 13:35:31.697154 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="pull" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.697169 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="pull" Mar 20 13:35:31 crc kubenswrapper[4895]: E0320 13:35:31.697184 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="extract" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.697191 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="extract" Mar 20 13:35:31 crc kubenswrapper[4895]: E0320 13:35:31.697200 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="util" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.697208 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="util" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.697325 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb0cccd-3628-4791-a0e2-c4042fb00e33" containerName="extract" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.697903 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.700151 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.701160 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.701275 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-sv6k2" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.701772 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q"] Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.782035 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjdp\" (UniqueName: \"kubernetes.io/projected/223467bd-bc95-4b07-96d3-0cda20ea5a3c-kube-api-access-xzjdp\") pod \"nmstate-operator-796d4cfff4-cxq4q\" (UID: \"223467bd-bc95-4b07-96d3-0cda20ea5a3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.884146 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjdp\" (UniqueName: \"kubernetes.io/projected/223467bd-bc95-4b07-96d3-0cda20ea5a3c-kube-api-access-xzjdp\") pod \"nmstate-operator-796d4cfff4-cxq4q\" (UID: \"223467bd-bc95-4b07-96d3-0cda20ea5a3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" Mar 20 13:35:31 crc kubenswrapper[4895]: I0320 13:35:31.911470 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjdp\" (UniqueName: \"kubernetes.io/projected/223467bd-bc95-4b07-96d3-0cda20ea5a3c-kube-api-access-xzjdp\") pod \"nmstate-operator-796d4cfff4-cxq4q\" (UID: 
\"223467bd-bc95-4b07-96d3-0cda20ea5a3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" Mar 20 13:35:32 crc kubenswrapper[4895]: I0320 13:35:32.015370 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" Mar 20 13:35:32 crc kubenswrapper[4895]: I0320 13:35:32.309296 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q"] Mar 20 13:35:33 crc kubenswrapper[4895]: I0320 13:35:33.224125 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" event={"ID":"223467bd-bc95-4b07-96d3-0cda20ea5a3c","Type":"ContainerStarted","Data":"29207b544a1fcad16fd8e5ad2a2abf590c33362da783c9cd273dfde1bb9e4654"} Mar 20 13:35:35 crc kubenswrapper[4895]: I0320 13:35:35.237376 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" event={"ID":"223467bd-bc95-4b07-96d3-0cda20ea5a3c","Type":"ContainerStarted","Data":"d05391acf30b6284eea8421c0f5a2139d97e9a141a6c653425d264e19665b46b"} Mar 20 13:35:35 crc kubenswrapper[4895]: I0320 13:35:35.260240 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cxq4q" podStartSLOduration=1.732675408 podStartE2EDuration="4.260218046s" podCreationTimestamp="2026-03-20 13:35:31 +0000 UTC" firstStartedPulling="2026-03-20 13:35:32.312248732 +0000 UTC m=+831.821967698" lastFinishedPulling="2026-03-20 13:35:34.83979137 +0000 UTC m=+834.349510336" observedRunningTime="2026-03-20 13:35:35.254201646 +0000 UTC m=+834.763920652" watchObservedRunningTime="2026-03-20 13:35:35.260218046 +0000 UTC m=+834.769937012" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.000883 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 
13:35:43.002693 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.004249 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7qm8f" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.006681 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.007480 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.008857 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.093566 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fmpgx"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.094681 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.100718 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.104526 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.172573 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.172806 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfxn\" (UniqueName: \"kubernetes.io/projected/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-kube-api-access-mlfxn\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.172918 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltxw\" (UniqueName: \"kubernetes.io/projected/44daea46-7374-461a-9926-c66cb642296d-kube-api-access-nltxw\") pod \"nmstate-metrics-9b8c8685d-zrhs4\" (UID: \"44daea46-7374-461a-9926-c66cb642296d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.218971 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.219623 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.221472 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.221473 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.222074 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.222266 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jvmcf" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274139 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nltxw\" (UniqueName: \"kubernetes.io/projected/44daea46-7374-461a-9926-c66cb642296d-kube-api-access-nltxw\") pod \"nmstate-metrics-9b8c8685d-zrhs4\" (UID: \"44daea46-7374-461a-9926-c66cb642296d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274204 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-ovs-socket\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274506 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj2cr\" (UniqueName: \"kubernetes.io/projected/0a5d87c8-dc19-4250-b749-8827eb2de72f-kube-api-access-lj2cr\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " 
pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274545 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-dbus-socket\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274585 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274609 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfxn\" (UniqueName: \"kubernetes.io/projected/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-kube-api-access-mlfxn\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.274626 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-nmstate-lock\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.280430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.288796 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltxw\" (UniqueName: \"kubernetes.io/projected/44daea46-7374-461a-9926-c66cb642296d-kube-api-access-nltxw\") pod \"nmstate-metrics-9b8c8685d-zrhs4\" (UID: \"44daea46-7374-461a-9926-c66cb642296d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.290796 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfxn\" (UniqueName: \"kubernetes.io/projected/7d6c7d25-f568-4cde-9716-c5fa4b4747b7-kube-api-access-mlfxn\") pod \"nmstate-webhook-5f558f5558-xmmr9\" (UID: \"7d6c7d25-f568-4cde-9716-c5fa4b4747b7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.374619 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.375521 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnsqh\" (UniqueName: \"kubernetes.io/projected/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-kube-api-access-vnsqh\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.375646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-dbus-socket\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.375877 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-dbus-socket\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.381676 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.386381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.386537 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-nmstate-lock\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.386841 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.386980 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-ovs-socket\") pod \"nmstate-handler-fmpgx\" 
(UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.387089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj2cr\" (UniqueName: \"kubernetes.io/projected/0a5d87c8-dc19-4250-b749-8827eb2de72f-kube-api-access-lj2cr\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.387123 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-ovs-socket\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.386642 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a5d87c8-dc19-4250-b749-8827eb2de72f-nmstate-lock\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.395680 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fcc5cf8cf-zpjsj"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.396566 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.411047 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj2cr\" (UniqueName: \"kubernetes.io/projected/0a5d87c8-dc19-4250-b749-8827eb2de72f-kube-api-access-lj2cr\") pod \"nmstate-handler-fmpgx\" (UID: \"0a5d87c8-dc19-4250-b749-8827eb2de72f\") " pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.417096 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fcc5cf8cf-zpjsj"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489518 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-oauth-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-console-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489619 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489647 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-trusted-ca-bundle\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489668 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-oauth-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-service-ca\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc 
kubenswrapper[4895]: I0320 13:35:43.489761 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbbm\" (UniqueName: \"kubernetes.io/projected/e51ac31b-02a8-41a4-888f-69338002ab14-kube-api-access-9lbbm\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.489779 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnsqh\" (UniqueName: \"kubernetes.io/projected/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-kube-api-access-vnsqh\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.490801 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.498033 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: \"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.515023 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnsqh\" (UniqueName: \"kubernetes.io/projected/2fb869f8-f1cb-4f72-8ef8-d1969ea326aa-kube-api-access-vnsqh\") pod \"nmstate-console-plugin-86f58fcf4-h2wm7\" (UID: 
\"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.535060 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.591827 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-oauth-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.592250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-console-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.592379 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-trusted-ca-bundle\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593221 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-console-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593362 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593406 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-trusted-ca-bundle\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593423 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-oauth-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593471 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-service-ca\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.593568 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbbm\" (UniqueName: \"kubernetes.io/projected/e51ac31b-02a8-41a4-888f-69338002ab14-kube-api-access-9lbbm\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.594483 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-service-ca\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.594833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51ac31b-02a8-41a4-888f-69338002ab14-oauth-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.598026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-oauth-config\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.598024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51ac31b-02a8-41a4-888f-69338002ab14-console-serving-cert\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.609521 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.609924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbbm\" (UniqueName: \"kubernetes.io/projected/e51ac31b-02a8-41a4-888f-69338002ab14-kube-api-access-9lbbm\") pod \"console-5fcc5cf8cf-zpjsj\" (UID: \"e51ac31b-02a8-41a4-888f-69338002ab14\") " pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc 
kubenswrapper[4895]: I0320 13:35:43.645924 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4"] Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.707005 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.759529 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.784092 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7"] Mar 20 13:35:43 crc kubenswrapper[4895]: W0320 13:35:43.826502 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb869f8_f1cb_4f72_8ef8_d1969ea326aa.slice/crio-82f5aa3163ed3d8aab417d41ea96d6dddcf5600bd9204ff8bc6bce88d6302880 WatchSource:0}: Error finding container 82f5aa3163ed3d8aab417d41ea96d6dddcf5600bd9204ff8bc6bce88d6302880: Status 404 returned error can't find the container with id 82f5aa3163ed3d8aab417d41ea96d6dddcf5600bd9204ff8bc6bce88d6302880 Mar 20 13:35:43 crc kubenswrapper[4895]: I0320 13:35:43.990804 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fcc5cf8cf-zpjsj"] Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.290763 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" event={"ID":"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa","Type":"ContainerStarted","Data":"82f5aa3163ed3d8aab417d41ea96d6dddcf5600bd9204ff8bc6bce88d6302880"} Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.292139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fmpgx" 
event={"ID":"0a5d87c8-dc19-4250-b749-8827eb2de72f","Type":"ContainerStarted","Data":"d765439d6920947c2fc36df2a7264d8b8b93ef2ac1a6a77a186acc7236c79563"} Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.294458 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcc5cf8cf-zpjsj" event={"ID":"e51ac31b-02a8-41a4-888f-69338002ab14","Type":"ContainerStarted","Data":"3c6362abf0229dfe6e3550c07b3f29e2f1b523e42e4f02b6906ce5a03ea79236"} Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.294545 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcc5cf8cf-zpjsj" event={"ID":"e51ac31b-02a8-41a4-888f-69338002ab14","Type":"ContainerStarted","Data":"3f00d538490d11695dee907ee22b4d2d196d5849318eec252497c6e9221812f8"} Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.295712 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" event={"ID":"44daea46-7374-461a-9926-c66cb642296d","Type":"ContainerStarted","Data":"a38bab1107e2eaa2f54ea836d2818cf08527f5f35f91b9c4e06fc5051234f191"} Mar 20 13:35:44 crc kubenswrapper[4895]: I0320 13:35:44.297028 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" event={"ID":"7d6c7d25-f568-4cde-9716-c5fa4b4747b7","Type":"ContainerStarted","Data":"66591ca8244ae9d52ad76124ce62036a5189c1e6b92ed31fcc8f496c0fc5f06a"} Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.346940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" event={"ID":"2fb869f8-f1cb-4f72-8ef8-d1969ea326aa","Type":"ContainerStarted","Data":"fcd5cec0d3d2c017f7d5cb1991dd13f7b23a2dbaebb122ef3ae43198837594a5"} Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.350305 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" 
event={"ID":"44daea46-7374-461a-9926-c66cb642296d","Type":"ContainerStarted","Data":"a9470add6ac7d0ae916838a52bcf44960e2d3cc9e5f1c9b83b33953b2660b529"} Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.383820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" event={"ID":"7d6c7d25-f568-4cde-9716-c5fa4b4747b7","Type":"ContainerStarted","Data":"0353ca191e92fcc71f0cb91eacd89177b52f18c57436eadde0188e8bc10812e9"} Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.384751 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.388569 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fcc5cf8cf-zpjsj" podStartSLOduration=6.385382841 podStartE2EDuration="6.385382841s" podCreationTimestamp="2026-03-20 13:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:45.349540861 +0000 UTC m=+844.859259897" watchObservedRunningTime="2026-03-20 13:35:49.385382841 +0000 UTC m=+848.895101817" Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.391154 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-h2wm7" podStartSLOduration=1.1616211889999999 podStartE2EDuration="6.391139804s" podCreationTimestamp="2026-03-20 13:35:43 +0000 UTC" firstStartedPulling="2026-03-20 13:35:43.832528725 +0000 UTC m=+843.342247691" lastFinishedPulling="2026-03-20 13:35:49.06204733 +0000 UTC m=+848.571766306" observedRunningTime="2026-03-20 13:35:49.381789211 +0000 UTC m=+848.891508197" watchObservedRunningTime="2026-03-20 13:35:49.391139804 +0000 UTC m=+848.900858780" Mar 20 13:35:49 crc kubenswrapper[4895]: I0320 13:35:49.435564 4895 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" podStartSLOduration=1.984510814 podStartE2EDuration="7.435540198s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="2026-03-20 13:35:43.629769832 +0000 UTC m=+843.139488798" lastFinishedPulling="2026-03-20 13:35:49.080799206 +0000 UTC m=+848.590518182" observedRunningTime="2026-03-20 13:35:49.430906593 +0000 UTC m=+848.940625579" watchObservedRunningTime="2026-03-20 13:35:49.435540198 +0000 UTC m=+848.945259174" Mar 20 13:35:50 crc kubenswrapper[4895]: I0320 13:35:50.394596 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fmpgx" event={"ID":"0a5d87c8-dc19-4250-b749-8827eb2de72f","Type":"ContainerStarted","Data":"16ce3cf8eaa56543c3b7e42794aaceb1898ec95b8c99b850cde7be1e5cf7db7b"} Mar 20 13:35:50 crc kubenswrapper[4895]: I0320 13:35:50.394879 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:35:50 crc kubenswrapper[4895]: I0320 13:35:50.413184 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fmpgx" podStartSLOduration=2.105442271 podStartE2EDuration="7.413160681s" podCreationTimestamp="2026-03-20 13:35:43 +0000 UTC" firstStartedPulling="2026-03-20 13:35:43.75715743 +0000 UTC m=+843.266876396" lastFinishedPulling="2026-03-20 13:35:49.06487583 +0000 UTC m=+848.574594806" observedRunningTime="2026-03-20 13:35:50.410086544 +0000 UTC m=+849.919805520" watchObservedRunningTime="2026-03-20 13:35:50.413160681 +0000 UTC m=+849.922879647" Mar 20 13:35:52 crc kubenswrapper[4895]: I0320 13:35:52.412409 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" event={"ID":"44daea46-7374-461a-9926-c66cb642296d","Type":"ContainerStarted","Data":"71edd1c4b32f6fc3a5d307720708ff3b17af56a978f1ab784eaab555870fd456"} Mar 20 13:35:52 crc 
kubenswrapper[4895]: I0320 13:35:52.437304 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zrhs4" podStartSLOduration=1.983831747 podStartE2EDuration="10.437275869s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="2026-03-20 13:35:43.66065303 +0000 UTC m=+843.170371996" lastFinishedPulling="2026-03-20 13:35:52.114097152 +0000 UTC m=+851.623816118" observedRunningTime="2026-03-20 13:35:52.428178313 +0000 UTC m=+851.937897309" watchObservedRunningTime="2026-03-20 13:35:52.437275869 +0000 UTC m=+851.946994875" Mar 20 13:35:53 crc kubenswrapper[4895]: I0320 13:35:53.760328 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:53 crc kubenswrapper[4895]: I0320 13:35:53.760440 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:53 crc kubenswrapper[4895]: I0320 13:35:53.766160 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:54 crc kubenswrapper[4895]: I0320 13:35:54.430652 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fcc5cf8cf-zpjsj" Mar 20 13:35:54 crc kubenswrapper[4895]: I0320 13:35:54.493925 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:35:58 crc kubenswrapper[4895]: I0320 13:35:58.741810 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fmpgx" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.172218 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-85vf7"] Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.173924 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.176940 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.178434 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.179017 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-85vf7"] Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.183484 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.329889 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blwx\" (UniqueName: \"kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx\") pod \"auto-csr-approver-29566896-85vf7\" (UID: \"4616a449-db7a-40c3-9960-975e92a69030\") " pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.431008 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7blwx\" (UniqueName: \"kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx\") pod \"auto-csr-approver-29566896-85vf7\" (UID: \"4616a449-db7a-40c3-9960-975e92a69030\") " pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.467017 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7blwx\" (UniqueName: \"kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx\") pod \"auto-csr-approver-29566896-85vf7\" (UID: \"4616a449-db7a-40c3-9960-975e92a69030\") " 
pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.506566 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:00 crc kubenswrapper[4895]: I0320 13:36:00.969083 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-85vf7"] Mar 20 13:36:00 crc kubenswrapper[4895]: W0320 13:36:00.971379 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4616a449_db7a_40c3_9960_975e92a69030.slice/crio-33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b WatchSource:0}: Error finding container 33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b: Status 404 returned error can't find the container with id 33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b Mar 20 13:36:01 crc kubenswrapper[4895]: I0320 13:36:01.482560 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-85vf7" event={"ID":"4616a449-db7a-40c3-9960-975e92a69030","Type":"ContainerStarted","Data":"33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b"} Mar 20 13:36:03 crc kubenswrapper[4895]: I0320 13:36:03.390722 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-xmmr9" Mar 20 13:36:03 crc kubenswrapper[4895]: I0320 13:36:03.499807 4895 generic.go:334] "Generic (PLEG): container finished" podID="4616a449-db7a-40c3-9960-975e92a69030" containerID="02d2d77999792227a994e2c682e8c062dbb0f08795c20d2ad015d9d89d1efec9" exitCode=0 Mar 20 13:36:03 crc kubenswrapper[4895]: I0320 13:36:03.499893 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-85vf7" 
event={"ID":"4616a449-db7a-40c3-9960-975e92a69030","Type":"ContainerDied","Data":"02d2d77999792227a994e2c682e8c062dbb0f08795c20d2ad015d9d89d1efec9"} Mar 20 13:36:04 crc kubenswrapper[4895]: I0320 13:36:04.775449 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:04 crc kubenswrapper[4895]: I0320 13:36:04.799480 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7blwx\" (UniqueName: \"kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx\") pod \"4616a449-db7a-40c3-9960-975e92a69030\" (UID: \"4616a449-db7a-40c3-9960-975e92a69030\") " Mar 20 13:36:04 crc kubenswrapper[4895]: I0320 13:36:04.850909 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx" (OuterVolumeSpecName: "kube-api-access-7blwx") pod "4616a449-db7a-40c3-9960-975e92a69030" (UID: "4616a449-db7a-40c3-9960-975e92a69030"). InnerVolumeSpecName "kube-api-access-7blwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:04 crc kubenswrapper[4895]: I0320 13:36:04.901649 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7blwx\" (UniqueName: \"kubernetes.io/projected/4616a449-db7a-40c3-9960-975e92a69030-kube-api-access-7blwx\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:05 crc kubenswrapper[4895]: I0320 13:36:05.516599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-85vf7" event={"ID":"4616a449-db7a-40c3-9960-975e92a69030","Type":"ContainerDied","Data":"33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b"} Mar 20 13:36:05 crc kubenswrapper[4895]: I0320 13:36:05.516644 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33cbcfda1f3ad179b1e0ad6e0d98a06cf5c9d294d878cb47a4d9fbe58a937e7b" Mar 20 13:36:05 crc kubenswrapper[4895]: I0320 13:36:05.516695 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-85vf7" Mar 20 13:36:05 crc kubenswrapper[4895]: I0320 13:36:05.835066 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-xm9vg"] Mar 20 13:36:05 crc kubenswrapper[4895]: I0320 13:36:05.850217 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-xm9vg"] Mar 20 13:36:07 crc kubenswrapper[4895]: I0320 13:36:07.221077 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecd2503-6e65-462b-ada9-4fb5ede84f14" path="/var/lib/kubelet/pods/6ecd2503-6e65-462b-ada9-4fb5ede84f14/volumes" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.392294 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc"] Mar 20 13:36:18 crc kubenswrapper[4895]: E0320 13:36:18.393040 4895 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4616a449-db7a-40c3-9960-975e92a69030" containerName="oc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.393056 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4616a449-db7a-40c3-9960-975e92a69030" containerName="oc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.393189 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4616a449-db7a-40c3-9960-975e92a69030" containerName="oc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.394176 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.396009 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.402073 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc"] Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.489541 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.489788 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.489886 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5t2r\" (UniqueName: \"kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.590954 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.591207 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.591286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5t2r\" (UniqueName: \"kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 
13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.591689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.591986 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.619600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5t2r\" (UniqueName: \"kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.760863 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:18 crc kubenswrapper[4895]: I0320 13:36:18.956784 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc"] Mar 20 13:36:18 crc kubenswrapper[4895]: W0320 13:36:18.965581 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50bb0b3_ed45_4db8_b26d_33f0c8d3cfaf.slice/crio-c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d WatchSource:0}: Error finding container c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d: Status 404 returned error can't find the container with id c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.547879 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wrj6w" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerName="console" containerID="cri-o://b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64" gracePeriod=15 Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.613324 4895 generic.go:334] "Generic (PLEG): container finished" podID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerID="4b5d6ac6e75c50dd5fab36e71dae74e732c5d6e32d9e9f81d43d9c219e97acb2" exitCode=0 Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.613581 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" event={"ID":"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf","Type":"ContainerDied","Data":"4b5d6ac6e75c50dd5fab36e71dae74e732c5d6e32d9e9f81d43d9c219e97acb2"} Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.613610 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" event={"ID":"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf","Type":"ContainerStarted","Data":"c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d"} Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.911707 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wrj6w_565b4975-d16b-4ce5-8200-a0700d9e9d4c/console/0.log" Mar 20 13:36:19 crc kubenswrapper[4895]: I0320 13:36:19.912059 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008243 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008284 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9prgk\" (UniqueName: \"kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008318 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008341 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config\") 
pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008362 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.008473 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert\") pod \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\" (UID: \"565b4975-d16b-4ce5-8200-a0700d9e9d4c\") " Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.009219 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.009234 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca" (OuterVolumeSpecName: "service-ca") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.009272 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.009317 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config" (OuterVolumeSpecName: "console-config") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.017510 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk" (OuterVolumeSpecName: "kube-api-access-9prgk") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "kube-api-access-9prgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.024502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.026860 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "565b4975-d16b-4ce5-8200-a0700d9e9d4c" (UID: "565b4975-d16b-4ce5-8200-a0700d9e9d4c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109826 4895 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109862 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109871 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9prgk\" (UniqueName: \"kubernetes.io/projected/565b4975-d16b-4ce5-8200-a0700d9e9d4c-kube-api-access-9prgk\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109883 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109891 4895 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109901 4895 reconciler_common.go:293] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.109909 4895 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/565b4975-d16b-4ce5-8200-a0700d9e9d4c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623154 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wrj6w_565b4975-d16b-4ce5-8200-a0700d9e9d4c/console/0.log" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623218 4895 generic.go:334] "Generic (PLEG): container finished" podID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerID="b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64" exitCode=2 Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623252 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wrj6w" event={"ID":"565b4975-d16b-4ce5-8200-a0700d9e9d4c","Type":"ContainerDied","Data":"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64"} Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wrj6w" event={"ID":"565b4975-d16b-4ce5-8200-a0700d9e9d4c","Type":"ContainerDied","Data":"a9b4c69b971850e86543eb0a945f84911d166aa80d1c6a477d900246135a912b"} Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623304 4895 scope.go:117] "RemoveContainer" containerID="b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.623342 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wrj6w" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.651970 4895 scope.go:117] "RemoveContainer" containerID="b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64" Mar 20 13:36:20 crc kubenswrapper[4895]: E0320 13:36:20.652794 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64\": container with ID starting with b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64 not found: ID does not exist" containerID="b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.652864 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64"} err="failed to get container status \"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64\": rpc error: code = NotFound desc = could not find container \"b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64\": container with ID starting with b14993517ca7a489e11fe169d7cdc7f36885bcbb658680e966b3113c5891ed64 not found: ID does not exist" Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.678600 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:36:20 crc kubenswrapper[4895]: I0320 13:36:20.688702 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wrj6w"] Mar 20 13:36:21 crc kubenswrapper[4895]: I0320 13:36:21.224267 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" path="/var/lib/kubelet/pods/565b4975-d16b-4ce5-8200-a0700d9e9d4c/volumes" Mar 20 13:36:21 crc kubenswrapper[4895]: I0320 13:36:21.638634 4895 generic.go:334] "Generic (PLEG): 
container finished" podID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerID="0039af9ba16df605fd17b09ae4f78c6907e7fbe20739e1777ea9212df2d0292b" exitCode=0 Mar 20 13:36:21 crc kubenswrapper[4895]: I0320 13:36:21.638735 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" event={"ID":"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf","Type":"ContainerDied","Data":"0039af9ba16df605fd17b09ae4f78c6907e7fbe20739e1777ea9212df2d0292b"} Mar 20 13:36:22 crc kubenswrapper[4895]: I0320 13:36:22.654610 4895 generic.go:334] "Generic (PLEG): container finished" podID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerID="90b65132ccb5bc2704052c2d045f7d39cc851d901c315624dfc4579ee8441e16" exitCode=0 Mar 20 13:36:22 crc kubenswrapper[4895]: I0320 13:36:22.655033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" event={"ID":"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf","Type":"ContainerDied","Data":"90b65132ccb5bc2704052c2d045f7d39cc851d901c315624dfc4579ee8441e16"} Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.024731 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.161979 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5t2r\" (UniqueName: \"kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r\") pod \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.162062 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util\") pod \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.162225 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle\") pod \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\" (UID: \"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf\") " Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.164018 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle" (OuterVolumeSpecName: "bundle") pod "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" (UID: "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.170461 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r" (OuterVolumeSpecName: "kube-api-access-l5t2r") pod "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" (UID: "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf"). InnerVolumeSpecName "kube-api-access-l5t2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.186464 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util" (OuterVolumeSpecName: "util") pod "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" (UID: "b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.263840 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.263871 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5t2r\" (UniqueName: \"kubernetes.io/projected/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-kube-api-access-l5t2r\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.263885 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.673230 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" event={"ID":"b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf","Type":"ContainerDied","Data":"c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d"} Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.673951 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c39f986ae2e68605620b2f388bf3ffd8fd2449d189a960921251e8ca30dc5d4d" Mar 20 13:36:24 crc kubenswrapper[4895]: I0320 13:36:24.673287 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.158205 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x"] Mar 20 13:36:34 crc kubenswrapper[4895]: E0320 13:36:34.158924 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="pull" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.158935 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="pull" Mar 20 13:36:34 crc kubenswrapper[4895]: E0320 13:36:34.158946 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerName="console" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.158952 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerName="console" Mar 20 13:36:34 crc kubenswrapper[4895]: E0320 13:36:34.158964 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="util" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.158970 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="util" Mar 20 13:36:34 crc kubenswrapper[4895]: E0320 13:36:34.158979 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="extract" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.158986 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" containerName="extract" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.159085 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf" 
containerName="extract" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.159108 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="565b4975-d16b-4ce5-8200-a0700d9e9d4c" containerName="console" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.159534 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.160772 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.161024 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.161375 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zghs9" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.161670 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.172148 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.191542 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x"] Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.200800 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-webhook-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 
13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.200839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8dpm\" (UniqueName: \"kubernetes.io/projected/434eee1f-2ee9-41fd-b97a-9f995142dcfc-kube-api-access-q8dpm\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.200909 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-apiservice-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.302279 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-apiservice-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.302400 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8dpm\" (UniqueName: \"kubernetes.io/projected/434eee1f-2ee9-41fd-b97a-9f995142dcfc-kube-api-access-q8dpm\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.302422 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-webhook-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.307629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-webhook-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.320125 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/434eee1f-2ee9-41fd-b97a-9f995142dcfc-apiservice-cert\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.328851 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8dpm\" (UniqueName: \"kubernetes.io/projected/434eee1f-2ee9-41fd-b97a-9f995142dcfc-kube-api-access-q8dpm\") pod \"metallb-operator-controller-manager-7864bd6b9f-n862x\" (UID: \"434eee1f-2ee9-41fd-b97a-9f995142dcfc\") " pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.497084 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.630527 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr"] Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.631465 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.634353 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.634560 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-z7xfl" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.634682 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.642763 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr"] Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.710205 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-apiservice-cert\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.710278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-webhook-cert\") pod 
\"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.710403 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4wj\" (UniqueName: \"kubernetes.io/projected/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-kube-api-access-hs4wj\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.811648 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4wj\" (UniqueName: \"kubernetes.io/projected/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-kube-api-access-hs4wj\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.812057 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-apiservice-cert\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.812104 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-webhook-cert\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 
13:36:34.817109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-webhook-cert\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.826926 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-apiservice-cert\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.830863 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4wj\" (UniqueName: \"kubernetes.io/projected/a9b00cd0-442e-401d-b062-4a9bd5ae35f9-kube-api-access-hs4wj\") pod \"metallb-operator-webhook-server-67445d8464-hmtkr\" (UID: \"a9b00cd0-442e-401d-b062-4a9bd5ae35f9\") " pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.962848 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:34 crc kubenswrapper[4895]: I0320 13:36:34.980046 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x"] Mar 20 13:36:35 crc kubenswrapper[4895]: I0320 13:36:35.377531 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr"] Mar 20 13:36:35 crc kubenswrapper[4895]: W0320 13:36:35.390516 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b00cd0_442e_401d_b062_4a9bd5ae35f9.slice/crio-d7611c4003644954995ac4795c3505c81445ea73fdc38d41be54f98a70d6372b WatchSource:0}: Error finding container d7611c4003644954995ac4795c3505c81445ea73fdc38d41be54f98a70d6372b: Status 404 returned error can't find the container with id d7611c4003644954995ac4795c3505c81445ea73fdc38d41be54f98a70d6372b Mar 20 13:36:35 crc kubenswrapper[4895]: I0320 13:36:35.742092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" event={"ID":"434eee1f-2ee9-41fd-b97a-9f995142dcfc","Type":"ContainerStarted","Data":"186c14630bee4ed1358d483704f81974453019e510137ee161081a793991d3da"} Mar 20 13:36:35 crc kubenswrapper[4895]: I0320 13:36:35.743641 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" event={"ID":"a9b00cd0-442e-401d-b062-4a9bd5ae35f9","Type":"ContainerStarted","Data":"d7611c4003644954995ac4795c3505c81445ea73fdc38d41be54f98a70d6372b"} Mar 20 13:36:38 crc kubenswrapper[4895]: I0320 13:36:38.773574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" 
event={"ID":"434eee1f-2ee9-41fd-b97a-9f995142dcfc","Type":"ContainerStarted","Data":"3428a5429b2e2a67d939935ace7b1b66d5e8f8dab0701466a3b797aece05ba77"} Mar 20 13:36:38 crc kubenswrapper[4895]: I0320 13:36:38.773866 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:36:38 crc kubenswrapper[4895]: I0320 13:36:38.802927 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" podStartSLOduration=1.317392069 podStartE2EDuration="4.802911852s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:34.988296735 +0000 UTC m=+894.498015701" lastFinishedPulling="2026-03-20 13:36:38.473816518 +0000 UTC m=+897.983535484" observedRunningTime="2026-03-20 13:36:38.801551758 +0000 UTC m=+898.311270724" watchObservedRunningTime="2026-03-20 13:36:38.802911852 +0000 UTC m=+898.312630818" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.112531 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.113578 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.121345 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.168454 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzdx\" (UniqueName: \"kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.168499 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.168565 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.270178 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.270332 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8nzdx\" (UniqueName: \"kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.270382 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.271263 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.272987 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.293104 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzdx\" (UniqueName: \"kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx\") pod \"certified-operators-lr25w\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:39 crc kubenswrapper[4895]: I0320 13:36:39.446031 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.321885 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.334170 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.343627 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.392066 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.410623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.410666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzggs\" (UniqueName: \"kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.410897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " 
pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.511959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.512019 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.512046 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzggs\" (UniqueName: \"kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.512512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.512650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" 
Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.533449 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzggs\" (UniqueName: \"kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs\") pod \"redhat-marketplace-69kvl\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.691374 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.794293 4895 generic.go:334] "Generic (PLEG): container finished" podID="2289befe-acb8-438b-bc91-512831cac5f5" containerID="b26e0aa192631a1691f567bd1b40bc8e8ecdc74c8a658039ba4d6d9fe3946f9b" exitCode=0 Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.794708 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerDied","Data":"b26e0aa192631a1691f567bd1b40bc8e8ecdc74c8a658039ba4d6d9fe3946f9b"} Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.794749 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerStarted","Data":"2f5523999d9bb86c970ae619313297123420b0db31e6241c221667a396f62ad5"} Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.799822 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" event={"ID":"a9b00cd0-442e-401d-b062-4a9bd5ae35f9","Type":"ContainerStarted","Data":"db78422dafcef1e83ca9c9520b4e508eab77002d0045238a398ecc4dccdde29d"} Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.799943 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:41 crc kubenswrapper[4895]: I0320 13:36:41.844899 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" podStartSLOduration=2.0692516579999998 podStartE2EDuration="7.844877605s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.392338623 +0000 UTC m=+894.902057589" lastFinishedPulling="2026-03-20 13:36:41.16796457 +0000 UTC m=+900.677683536" observedRunningTime="2026-03-20 13:36:41.840950677 +0000 UTC m=+901.350669683" watchObservedRunningTime="2026-03-20 13:36:41.844877605 +0000 UTC m=+901.354596571" Mar 20 13:36:42 crc kubenswrapper[4895]: I0320 13:36:42.111089 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:42 crc kubenswrapper[4895]: W0320 13:36:42.118341 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9f3edf_cfe7_47af_a17b_5d6b662849a2.slice/crio-42f7410466c07e34ae8cfe046cf9880560a24f1bc30a7f7f3eb1567e4b8bf6b6 WatchSource:0}: Error finding container 42f7410466c07e34ae8cfe046cf9880560a24f1bc30a7f7f3eb1567e4b8bf6b6: Status 404 returned error can't find the container with id 42f7410466c07e34ae8cfe046cf9880560a24f1bc30a7f7f3eb1567e4b8bf6b6 Mar 20 13:36:42 crc kubenswrapper[4895]: I0320 13:36:42.808545 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerStarted","Data":"fb83a13ededcd6669340385c508a922e618f661dbca17915d5cd5abd4ac574f4"} Mar 20 13:36:42 crc kubenswrapper[4895]: I0320 13:36:42.811047 4895 generic.go:334] "Generic (PLEG): container finished" podID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" 
containerID="ed2edd5109612eacad743dca0801fdf9f63ab52e11f93c9dc00c849388775906" exitCode=0 Mar 20 13:36:42 crc kubenswrapper[4895]: I0320 13:36:42.811114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerDied","Data":"ed2edd5109612eacad743dca0801fdf9f63ab52e11f93c9dc00c849388775906"} Mar 20 13:36:42 crc kubenswrapper[4895]: I0320 13:36:42.811185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerStarted","Data":"42f7410466c07e34ae8cfe046cf9880560a24f1bc30a7f7f3eb1567e4b8bf6b6"} Mar 20 13:36:43 crc kubenswrapper[4895]: I0320 13:36:43.208722 4895 scope.go:117] "RemoveContainer" containerID="e1def2f44dc457d1e9e6201a77041537be4ee6287e9b19c32ce6378f7cecf2ab" Mar 20 13:36:43 crc kubenswrapper[4895]: I0320 13:36:43.825378 4895 generic.go:334] "Generic (PLEG): container finished" podID="2289befe-acb8-438b-bc91-512831cac5f5" containerID="fb83a13ededcd6669340385c508a922e618f661dbca17915d5cd5abd4ac574f4" exitCode=0 Mar 20 13:36:43 crc kubenswrapper[4895]: I0320 13:36:43.825514 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerDied","Data":"fb83a13ededcd6669340385c508a922e618f661dbca17915d5cd5abd4ac574f4"} Mar 20 13:36:44 crc kubenswrapper[4895]: I0320 13:36:44.839257 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerStarted","Data":"a2e4a53d88076221534416f49f4be5fe315bcb7ac40764d5b88b3f0edeb350bf"} Mar 20 13:36:44 crc kubenswrapper[4895]: I0320 13:36:44.842328 4895 generic.go:334] "Generic (PLEG): container finished" podID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" 
containerID="1be0aefa6914bc698972cc03b2a0cbd98734ce8ec493501db23d4b6b13cf39af" exitCode=0 Mar 20 13:36:44 crc kubenswrapper[4895]: I0320 13:36:44.842367 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerDied","Data":"1be0aefa6914bc698972cc03b2a0cbd98734ce8ec493501db23d4b6b13cf39af"} Mar 20 13:36:44 crc kubenswrapper[4895]: I0320 13:36:44.875332 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lr25w" podStartSLOduration=3.293189863 podStartE2EDuration="5.875309449s" podCreationTimestamp="2026-03-20 13:36:39 +0000 UTC" firstStartedPulling="2026-03-20 13:36:41.798661325 +0000 UTC m=+901.308380281" lastFinishedPulling="2026-03-20 13:36:44.380780891 +0000 UTC m=+903.890499867" observedRunningTime="2026-03-20 13:36:44.868552311 +0000 UTC m=+904.378271287" watchObservedRunningTime="2026-03-20 13:36:44.875309449 +0000 UTC m=+904.385028425" Mar 20 13:36:45 crc kubenswrapper[4895]: I0320 13:36:45.849803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerStarted","Data":"0a1ba805dfe7f1fc53f5d0c087c665bb9616f4466582d12b70722b3edeebaa20"} Mar 20 13:36:45 crc kubenswrapper[4895]: I0320 13:36:45.876345 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-69kvl" podStartSLOduration=2.385004337 podStartE2EDuration="4.876329274s" podCreationTimestamp="2026-03-20 13:36:41 +0000 UTC" firstStartedPulling="2026-03-20 13:36:42.81213593 +0000 UTC m=+902.321854896" lastFinishedPulling="2026-03-20 13:36:45.303460867 +0000 UTC m=+904.813179833" observedRunningTime="2026-03-20 13:36:45.872288113 +0000 UTC m=+905.382007079" watchObservedRunningTime="2026-03-20 13:36:45.876329274 +0000 UTC m=+905.386048240" Mar 20 
13:36:49 crc kubenswrapper[4895]: I0320 13:36:49.446666 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:49 crc kubenswrapper[4895]: I0320 13:36:49.447008 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:49 crc kubenswrapper[4895]: I0320 13:36:49.508788 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:49 crc kubenswrapper[4895]: I0320 13:36:49.911709 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:51 crc kubenswrapper[4895]: I0320 13:36:51.692287 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:51 crc kubenswrapper[4895]: I0320 13:36:51.692352 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:51 crc kubenswrapper[4895]: I0320 13:36:51.741115 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:51 crc kubenswrapper[4895]: I0320 13:36:51.932652 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:52 crc kubenswrapper[4895]: I0320 13:36:52.705920 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:52 crc kubenswrapper[4895]: I0320 13:36:52.706501 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lr25w" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="registry-server" 
containerID="cri-o://a2e4a53d88076221534416f49f4be5fe315bcb7ac40764d5b88b3f0edeb350bf" gracePeriod=2 Mar 20 13:36:53 crc kubenswrapper[4895]: I0320 13:36:53.897808 4895 generic.go:334] "Generic (PLEG): container finished" podID="2289befe-acb8-438b-bc91-512831cac5f5" containerID="a2e4a53d88076221534416f49f4be5fe315bcb7ac40764d5b88b3f0edeb350bf" exitCode=0 Mar 20 13:36:53 crc kubenswrapper[4895]: I0320 13:36:53.898088 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerDied","Data":"a2e4a53d88076221534416f49f4be5fe315bcb7ac40764d5b88b3f0edeb350bf"} Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.373022 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.490680 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nzdx\" (UniqueName: \"kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx\") pod \"2289befe-acb8-438b-bc91-512831cac5f5\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.490727 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities\") pod \"2289befe-acb8-438b-bc91-512831cac5f5\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.490852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content\") pod \"2289befe-acb8-438b-bc91-512831cac5f5\" (UID: \"2289befe-acb8-438b-bc91-512831cac5f5\") " Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 
13:36:54.491760 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities" (OuterVolumeSpecName: "utilities") pod "2289befe-acb8-438b-bc91-512831cac5f5" (UID: "2289befe-acb8-438b-bc91-512831cac5f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.495742 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx" (OuterVolumeSpecName: "kube-api-access-8nzdx") pod "2289befe-acb8-438b-bc91-512831cac5f5" (UID: "2289befe-acb8-438b-bc91-512831cac5f5"). InnerVolumeSpecName "kube-api-access-8nzdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.542467 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2289befe-acb8-438b-bc91-512831cac5f5" (UID: "2289befe-acb8-438b-bc91-512831cac5f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.591866 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.591899 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nzdx\" (UniqueName: \"kubernetes.io/projected/2289befe-acb8-438b-bc91-512831cac5f5-kube-api-access-8nzdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.591911 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2289befe-acb8-438b-bc91-512831cac5f5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.905754 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lr25w" event={"ID":"2289befe-acb8-438b-bc91-512831cac5f5","Type":"ContainerDied","Data":"2f5523999d9bb86c970ae619313297123420b0db31e6241c221667a396f62ad5"} Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.906381 4895 scope.go:117] "RemoveContainer" containerID="a2e4a53d88076221534416f49f4be5fe315bcb7ac40764d5b88b3f0edeb350bf" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.905803 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lr25w" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.923787 4895 scope.go:117] "RemoveContainer" containerID="fb83a13ededcd6669340385c508a922e618f661dbca17915d5cd5abd4ac574f4" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.933468 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.938355 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lr25w"] Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.959637 4895 scope.go:117] "RemoveContainer" containerID="b26e0aa192631a1691f567bd1b40bc8e8ecdc74c8a658039ba4d6d9fe3946f9b" Mar 20 13:36:54 crc kubenswrapper[4895]: I0320 13:36:54.966737 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-67445d8464-hmtkr" Mar 20 13:36:55 crc kubenswrapper[4895]: I0320 13:36:55.220187 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2289befe-acb8-438b-bc91-512831cac5f5" path="/var/lib/kubelet/pods/2289befe-acb8-438b-bc91-512831cac5f5/volumes" Mar 20 13:36:55 crc kubenswrapper[4895]: I0320 13:36:55.305726 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:55 crc kubenswrapper[4895]: I0320 13:36:55.306186 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-69kvl" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="registry-server" containerID="cri-o://0a1ba805dfe7f1fc53f5d0c087c665bb9616f4466582d12b70722b3edeebaa20" gracePeriod=2 Mar 20 13:36:55 crc kubenswrapper[4895]: I0320 13:36:55.915320 4895 generic.go:334] "Generic (PLEG): container finished" podID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" 
containerID="0a1ba805dfe7f1fc53f5d0c087c665bb9616f4466582d12b70722b3edeebaa20" exitCode=0 Mar 20 13:36:55 crc kubenswrapper[4895]: I0320 13:36:55.915358 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerDied","Data":"0a1ba805dfe7f1fc53f5d0c087c665bb9616f4466582d12b70722b3edeebaa20"} Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.153730 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.312819 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities\") pod \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.312896 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzggs\" (UniqueName: \"kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs\") pod \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.313025 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content\") pod \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\" (UID: \"fa9f3edf-cfe7-47af-a17b-5d6b662849a2\") " Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.314424 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities" (OuterVolumeSpecName: "utilities") pod "fa9f3edf-cfe7-47af-a17b-5d6b662849a2" (UID: 
"fa9f3edf-cfe7-47af-a17b-5d6b662849a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.326573 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs" (OuterVolumeSpecName: "kube-api-access-hzggs") pod "fa9f3edf-cfe7-47af-a17b-5d6b662849a2" (UID: "fa9f3edf-cfe7-47af-a17b-5d6b662849a2"). InnerVolumeSpecName "kube-api-access-hzggs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.340951 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa9f3edf-cfe7-47af-a17b-5d6b662849a2" (UID: "fa9f3edf-cfe7-47af-a17b-5d6b662849a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.414436 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzggs\" (UniqueName: \"kubernetes.io/projected/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-kube-api-access-hzggs\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.414496 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.414516 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9f3edf-cfe7-47af-a17b-5d6b662849a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.922948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-69kvl" event={"ID":"fa9f3edf-cfe7-47af-a17b-5d6b662849a2","Type":"ContainerDied","Data":"42f7410466c07e34ae8cfe046cf9880560a24f1bc30a7f7f3eb1567e4b8bf6b6"} Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.923001 4895 scope.go:117] "RemoveContainer" containerID="0a1ba805dfe7f1fc53f5d0c087c665bb9616f4466582d12b70722b3edeebaa20" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.923103 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-69kvl" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.944573 4895 scope.go:117] "RemoveContainer" containerID="1be0aefa6914bc698972cc03b2a0cbd98734ce8ec493501db23d4b6b13cf39af" Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.967238 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.976936 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-69kvl"] Mar 20 13:36:56 crc kubenswrapper[4895]: I0320 13:36:56.990137 4895 scope.go:117] "RemoveContainer" containerID="ed2edd5109612eacad743dca0801fdf9f63ab52e11f93c9dc00c849388775906" Mar 20 13:36:57 crc kubenswrapper[4895]: I0320 13:36:57.220147 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" path="/var/lib/kubelet/pods/fa9f3edf-cfe7-47af-a17b-5d6b662849a2/volumes" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.221335 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6wcbv"] Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222194 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="extract-content" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222213 4895 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="extract-content" Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222231 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="extract-utilities" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222243 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="extract-utilities" Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222256 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="extract-utilities" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222265 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="extract-utilities" Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222279 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222288 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222303 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="extract-content" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222312 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="extract-content" Mar 20 13:37:07 crc kubenswrapper[4895]: E0320 13:37:07.222329 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222338 4895 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222545 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9f3edf-cfe7-47af-a17b-5d6b662849a2" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.222560 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2289befe-acb8-438b-bc91-512831cac5f5" containerName="registry-server" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.223761 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.224914 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wcbv"] Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.287326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpvm\" (UniqueName: \"kubernetes.io/projected/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-kube-api-access-jfpvm\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.287658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-catalog-content\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.287806 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-utilities\") pod 
\"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.389504 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-catalog-content\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.389788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-utilities\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.389876 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpvm\" (UniqueName: \"kubernetes.io/projected/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-kube-api-access-jfpvm\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.390349 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-utilities\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.390507 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-catalog-content\") pod \"community-operators-6wcbv\" (UID: 
\"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.413248 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpvm\" (UniqueName: \"kubernetes.io/projected/045dcbb1-ad32-451d-ad2d-1f2c243bb0ee-kube-api-access-jfpvm\") pod \"community-operators-6wcbv\" (UID: \"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee\") " pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.544583 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wcbv" Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.812061 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wcbv"] Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.998071 4895 generic.go:334] "Generic (PLEG): container finished" podID="045dcbb1-ad32-451d-ad2d-1f2c243bb0ee" containerID="9efd789a4f06bf91457fb52a1d22e2891166455d78bc5823f467edf78f4f868f" exitCode=0 Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.998110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wcbv" event={"ID":"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee","Type":"ContainerDied","Data":"9efd789a4f06bf91457fb52a1d22e2891166455d78bc5823f467edf78f4f868f"} Mar 20 13:37:07 crc kubenswrapper[4895]: I0320 13:37:07.998135 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wcbv" event={"ID":"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee","Type":"ContainerStarted","Data":"114c19c88db4e54f9a63a1d4e3a6c78afd6eed488719e94b56fb05fbf7fafc63"} Mar 20 13:37:13 crc kubenswrapper[4895]: I0320 13:37:13.030846 4895 generic.go:334] "Generic (PLEG): container finished" podID="045dcbb1-ad32-451d-ad2d-1f2c243bb0ee" 
containerID="45e11e19631c7787e6d63a706acfc74342436afe3cd6ca4eda13271b724b55f6" exitCode=0 Mar 20 13:37:13 crc kubenswrapper[4895]: I0320 13:37:13.030938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wcbv" event={"ID":"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee","Type":"ContainerDied","Data":"45e11e19631c7787e6d63a706acfc74342436afe3cd6ca4eda13271b724b55f6"} Mar 20 13:37:14 crc kubenswrapper[4895]: I0320 13:37:14.038829 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wcbv" event={"ID":"045dcbb1-ad32-451d-ad2d-1f2c243bb0ee","Type":"ContainerStarted","Data":"902a59cbf95399e6231674f3be749a653a71c2eb0e9136348baceba9e4561821"} Mar 20 13:37:14 crc kubenswrapper[4895]: I0320 13:37:14.059300 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6wcbv" podStartSLOduration=1.4303746290000001 podStartE2EDuration="7.059279387s" podCreationTimestamp="2026-03-20 13:37:07 +0000 UTC" firstStartedPulling="2026-03-20 13:37:07.999198351 +0000 UTC m=+927.508917317" lastFinishedPulling="2026-03-20 13:37:13.628103089 +0000 UTC m=+933.137822075" observedRunningTime="2026-03-20 13:37:14.053148564 +0000 UTC m=+933.562867540" watchObservedRunningTime="2026-03-20 13:37:14.059279387 +0000 UTC m=+933.568998353" Mar 20 13:37:14 crc kubenswrapper[4895]: I0320 13:37:14.500924 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7864bd6b9f-n862x" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.197838 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.198545 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.201548 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xrzm5" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.204325 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.210161 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bg9zg"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.214284 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.234499 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.234813 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.254927 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.304739 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-fm5cg"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.306876 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-fm5cg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.308785 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.309577 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.310296 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9687b" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.313966 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314831 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-conf\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314872 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314925 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-sockets\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314943 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-metrics\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314965 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsj6x\" (UniqueName: \"kubernetes.io/projected/9f90de29-5eaa-4d44-8988-4623460dc401-kube-api-access-lsj6x\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.314987 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d956767-71fa-4b4f-b113-659286aa149c-metrics-certs\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.315004 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d956767-71fa-4b4f-b113-659286aa149c-frr-startup\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.315027 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gv4\" (UniqueName: \"kubernetes.io/projected/0d956767-71fa-4b4f-b113-659286aa149c-kube-api-access-k9gv4\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.315048 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-reloader\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.332465 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-rfwwx"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.333781 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-rfwwx" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.338904 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.346146 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-rfwwx"] Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417583 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417655 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-conf\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417713 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-cert\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417736 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417771 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metallb-excludel2\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417799 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-sockets\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-metrics\") pod \"frr-k8s-bg9zg\" (UID: 
\"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsj6x\" (UniqueName: \"kubernetes.io/projected/9f90de29-5eaa-4d44-8988-4623460dc401-kube-api-access-lsj6x\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d956767-71fa-4b4f-b113-659286aa149c-metrics-certs\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d956767-71fa-4b4f-b113-659286aa149c-frr-startup\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417938 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s78kx\" (UniqueName: \"kubernetes.io/projected/bd9dac02-de3d-43fb-9046-5280d8131d3b-kube-api-access-s78kx\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417963 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gv4\" (UniqueName: \"kubernetes.io/projected/0d956767-71fa-4b4f-b113-659286aa149c-kube-api-access-k9gv4\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " 
pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.417986 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-reloader\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.418040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.418065 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqpq\" (UniqueName: \"kubernetes.io/projected/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-kube-api-access-nsqpq\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg" Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.418220 4895 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.418272 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert podName:9f90de29-5eaa-4d44-8988-4623460dc401 nodeName:}" failed. No retries permitted until 2026-03-20 13:37:15.918253086 +0000 UTC m=+935.427972062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert") pod "frr-k8s-webhook-server-bcc4b6f68-5g6cb" (UID: "9f90de29-5eaa-4d44-8988-4623460dc401") : secret "frr-k8s-webhook-server-cert" not found Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.418713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-conf\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.418975 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-frr-sockets\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.419226 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-metrics\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.420648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0d956767-71fa-4b4f-b113-659286aa149c-reloader\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.421553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0d956767-71fa-4b4f-b113-659286aa149c-frr-startup\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg" 
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.432176 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d956767-71fa-4b4f-b113-659286aa149c-metrics-certs\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.434304 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gv4\" (UniqueName: \"kubernetes.io/projected/0d956767-71fa-4b4f-b113-659286aa149c-kube-api-access-k9gv4\") pod \"frr-k8s-bg9zg\" (UID: \"0d956767-71fa-4b4f-b113-659286aa149c\") " pod="metallb-system/frr-k8s-bg9zg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.439061 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsj6x\" (UniqueName: \"kubernetes.io/projected/9f90de29-5eaa-4d44-8988-4623460dc401-kube-api-access-lsj6x\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519380 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metallb-excludel2\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519454 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s78kx\" (UniqueName: \"kubernetes.io/projected/bd9dac02-de3d-43fb-9046-5280d8131d3b-kube-api-access-s78kx\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519481 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqpq\" (UniqueName: \"kubernetes.io/projected/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-kube-api-access-nsqpq\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519530 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519575 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-cert\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.519615 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519685 4895 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519771 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist podName:1b17d41e-b2b9-471a-9b0a-1b12899ba46b nodeName:}" failed. No retries permitted until 2026-03-20 13:37:16.0197509 +0000 UTC m=+935.529469876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist") pod "speaker-fm5cg" (UID: "1b17d41e-b2b9-471a-9b0a-1b12899ba46b") : secret "metallb-memberlist" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519804 4895 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519841 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs podName:bd9dac02-de3d-43fb-9046-5280d8131d3b nodeName:}" failed. No retries permitted until 2026-03-20 13:37:16.019829242 +0000 UTC m=+935.529548208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs") pod "controller-7bb4cc7c98-rfwwx" (UID: "bd9dac02-de3d-43fb-9046-5280d8131d3b") : secret "controller-certs-secret" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519703 4895 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: E0320 13:37:15.519923 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs podName:1b17d41e-b2b9-471a-9b0a-1b12899ba46b nodeName:}" failed. No retries permitted until 2026-03-20 13:37:16.019912464 +0000 UTC m=+935.529631550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs") pod "speaker-fm5cg" (UID: "1b17d41e-b2b9-471a-9b0a-1b12899ba46b") : secret "speaker-certs-secret" not found
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.520231 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metallb-excludel2\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.523710 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.535269 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-cert\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.535936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s78kx\" (UniqueName: \"kubernetes.io/projected/bd9dac02-de3d-43fb-9046-5280d8131d3b-kube-api-access-s78kx\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.545029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqpq\" (UniqueName: \"kubernetes.io/projected/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-kube-api-access-nsqpq\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.558817 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bg9zg"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.924939 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"
Mar 20 13:37:15 crc kubenswrapper[4895]: I0320 13:37:15.928621 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f90de29-5eaa-4d44-8988-4623460dc401-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5g6cb\" (UID: \"9f90de29-5eaa-4d44-8988-4623460dc401\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.026859 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.026923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.026964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:16 crc kubenswrapper[4895]: E0320 13:37:16.027046 4895 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 13:37:16 crc kubenswrapper[4895]: E0320 13:37:16.027120 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist podName:1b17d41e-b2b9-471a-9b0a-1b12899ba46b nodeName:}" failed. No retries permitted until 2026-03-20 13:37:17.02709878 +0000 UTC m=+936.536817746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist") pod "speaker-fm5cg" (UID: "1b17d41e-b2b9-471a-9b0a-1b12899ba46b") : secret "metallb-memberlist" not found
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.030100 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-metrics-certs\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.031053 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd9dac02-de3d-43fb-9046-5280d8131d3b-metrics-certs\") pod \"controller-7bb4cc7c98-rfwwx\" (UID: \"bd9dac02-de3d-43fb-9046-5280d8131d3b\") " pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.050223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"06c8c0d0b055aebc14e5d3add111232fd0b2355cc86e8a9f8c098264691f4740"}
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.114646 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.263884 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.353827 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"]
Mar 20 13:37:16 crc kubenswrapper[4895]: I0320 13:37:16.696758 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-rfwwx"]
Mar 20 13:37:16 crc kubenswrapper[4895]: W0320 13:37:16.705215 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd9dac02_de3d_43fb_9046_5280d8131d3b.slice/crio-50ebd0a9b08be3adbd55cd6c89fbe560a645dc45474b8b5aa3788dd9b0a4067e WatchSource:0}: Error finding container 50ebd0a9b08be3adbd55cd6c89fbe560a645dc45474b8b5aa3788dd9b0a4067e: Status 404 returned error can't find the container with id 50ebd0a9b08be3adbd55cd6c89fbe560a645dc45474b8b5aa3788dd9b0a4067e
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.043343 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.048956 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b17d41e-b2b9-471a-9b0a-1b12899ba46b-memberlist\") pod \"speaker-fm5cg\" (UID: \"1b17d41e-b2b9-471a-9b0a-1b12899ba46b\") " pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.060012 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rfwwx" event={"ID":"bd9dac02-de3d-43fb-9046-5280d8131d3b","Type":"ContainerStarted","Data":"b11d91dee09cbdcd3354390715e78417024a842298b9a8c160933f2a97fc8a5f"}
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.060057 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rfwwx" event={"ID":"bd9dac02-de3d-43fb-9046-5280d8131d3b","Type":"ContainerStarted","Data":"50ebd0a9b08be3adbd55cd6c89fbe560a645dc45474b8b5aa3788dd9b0a4067e"}
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.062027 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" event={"ID":"9f90de29-5eaa-4d44-8988-4623460dc401","Type":"ContainerStarted","Data":"8d73ae5f8a069996667debbd761252d5f5b7d2f3836f7613168766ea5153bedb"}
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.125315 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:17 crc kubenswrapper[4895]: W0320 13:37:17.152126 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b17d41e_b2b9_471a_9b0a_1b12899ba46b.slice/crio-98aba7f456bd800f7e4c99605a0ce115069c7484c15dc3f1d32fc95bf658995f WatchSource:0}: Error finding container 98aba7f456bd800f7e4c99605a0ce115069c7484c15dc3f1d32fc95bf658995f: Status 404 returned error can't find the container with id 98aba7f456bd800f7e4c99605a0ce115069c7484c15dc3f1d32fc95bf658995f
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.545181 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6wcbv"
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.545544 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6wcbv"
Mar 20 13:37:17 crc kubenswrapper[4895]: I0320 13:37:17.598021 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6wcbv"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.068742 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fm5cg" event={"ID":"1b17d41e-b2b9-471a-9b0a-1b12899ba46b","Type":"ContainerStarted","Data":"5d5b9caa281c2726155c2edc577f9db9e52f4ae533df6588913deb75df89fa94"}
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.068785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fm5cg" event={"ID":"1b17d41e-b2b9-471a-9b0a-1b12899ba46b","Type":"ContainerStarted","Data":"0f213e109165fea2ec67e366364a7c42b7d7f6158e9960acc77bafa47eda385c"}
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.068796 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fm5cg" event={"ID":"1b17d41e-b2b9-471a-9b0a-1b12899ba46b","Type":"ContainerStarted","Data":"98aba7f456bd800f7e4c99605a0ce115069c7484c15dc3f1d32fc95bf658995f"}
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.068953 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.071185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rfwwx" event={"ID":"bd9dac02-de3d-43fb-9046-5280d8131d3b","Type":"ContainerStarted","Data":"c408b77bc077f9be36c4d4b23bfa8d6c851688b1b9c887ea6387b8205edda36b"}
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.129574 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-rfwwx" podStartSLOduration=3.129558222 podStartE2EDuration="3.129558222s" podCreationTimestamp="2026-03-20 13:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:18.128447634 +0000 UTC m=+937.638166600" watchObservedRunningTime="2026-03-20 13:37:18.129558222 +0000 UTC m=+937.639277188"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.132912 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-fm5cg" podStartSLOduration=3.132899765 podStartE2EDuration="3.132899765s" podCreationTimestamp="2026-03-20 13:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:18.098009347 +0000 UTC m=+937.607728313" watchObservedRunningTime="2026-03-20 13:37:18.132899765 +0000 UTC m=+937.642618731"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.157541 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6wcbv"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.288877 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wcbv"]
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.325619 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.325926 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mng7h" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="registry-server" containerID="cri-o://dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb" gracePeriod=2
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.862831 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.966972 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities\") pod \"579f150c-66c2-4ea6-ad65-656ae172f27c\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") "
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.967063 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content\") pod \"579f150c-66c2-4ea6-ad65-656ae172f27c\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") "
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.967150 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566bc\" (UniqueName: \"kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc\") pod \"579f150c-66c2-4ea6-ad65-656ae172f27c\" (UID: \"579f150c-66c2-4ea6-ad65-656ae172f27c\") "
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.968170 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities" (OuterVolumeSpecName: "utilities") pod "579f150c-66c2-4ea6-ad65-656ae172f27c" (UID: "579f150c-66c2-4ea6-ad65-656ae172f27c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:37:18 crc kubenswrapper[4895]: I0320 13:37:18.973991 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc" (OuterVolumeSpecName: "kube-api-access-566bc") pod "579f150c-66c2-4ea6-ad65-656ae172f27c" (UID: "579f150c-66c2-4ea6-ad65-656ae172f27c"). InnerVolumeSpecName "kube-api-access-566bc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.068539 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.068568 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-566bc\" (UniqueName: \"kubernetes.io/projected/579f150c-66c2-4ea6-ad65-656ae172f27c-kube-api-access-566bc\") on node \"crc\" DevicePath \"\""
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.076222 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "579f150c-66c2-4ea6-ad65-656ae172f27c" (UID: "579f150c-66c2-4ea6-ad65-656ae172f27c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.081683 4895 generic.go:334] "Generic (PLEG): container finished" podID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerID="dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb" exitCode=0
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.082737 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mng7h"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.084931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerDied","Data":"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"}
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.085635 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mng7h" event={"ID":"579f150c-66c2-4ea6-ad65-656ae172f27c","Type":"ContainerDied","Data":"440a213fcabf35bcb4c85b95d4ff15f82bb7a13cfb04f4f413d0c31064699eef"}
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.085652 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.085675 4895 scope.go:117] "RemoveContainer" containerID="dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.112970 4895 scope.go:117] "RemoveContainer" containerID="875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.132933 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.140713 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mng7h"]
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.165538 4895 scope.go:117] "RemoveContainer" containerID="4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.172216 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579f150c-66c2-4ea6-ad65-656ae172f27c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.184738 4895 scope.go:117] "RemoveContainer" containerID="dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"
Mar 20 13:37:19 crc kubenswrapper[4895]: E0320 13:37:19.187315 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb\": container with ID starting with dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb not found: ID does not exist" containerID="dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.187373 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb"} err="failed to get container status \"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb\": rpc error: code = NotFound desc = could not find container \"dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb\": container with ID starting with dabb0ebd4836db0a27694676abfb9154bd9ecae5284728efe81c4c6e92d6caeb not found: ID does not exist"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.187431 4895 scope.go:117] "RemoveContainer" containerID="875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"
Mar 20 13:37:19 crc kubenswrapper[4895]: E0320 13:37:19.187985 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13\": container with ID starting with 875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13 not found: ID does not exist" containerID="875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.188019 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13"} err="failed to get container status \"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13\": rpc error: code = NotFound desc = could not find container \"875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13\": container with ID starting with 875b7942ee44885eb809ecbbab34d4712109c90e1004a068071dcaa2d1c80b13 not found: ID does not exist"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.188039 4895 scope.go:117] "RemoveContainer" containerID="4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b"
Mar 20 13:37:19 crc kubenswrapper[4895]: E0320 13:37:19.188361 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b\": container with ID starting with 4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b not found: ID does not exist" containerID="4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.188423 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b"} err="failed to get container status \"4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b\": rpc error: code = NotFound desc = could not find container \"4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b\": container with ID starting with 4fe063b4b6c7f73637ca9d01550065d52166fbdc1e43e33c56eddfbe5808ac8b not found: ID does not exist"
Mar 20 13:37:19 crc kubenswrapper[4895]: I0320 13:37:19.226355 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" path="/var/lib/kubelet/pods/579f150c-66c2-4ea6-ad65-656ae172f27c/volumes"
Mar 20 13:37:26 crc kubenswrapper[4895]: I0320 13:37:26.269932 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-rfwwx"
Mar 20 13:37:27 crc kubenswrapper[4895]: I0320 13:37:27.129526 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-fm5cg"
Mar 20 13:37:28 crc kubenswrapper[4895]: I0320 13:37:28.165914 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" event={"ID":"9f90de29-5eaa-4d44-8988-4623460dc401","Type":"ContainerStarted","Data":"80c4d7f14da734d5ff5bf32f19c2b784afe5e8bbc4eac663494a2d1936f8e924"}
Mar 20 13:37:28 crc kubenswrapper[4895]: I0320 13:37:28.166923 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb"
Mar 20 13:37:28 crc kubenswrapper[4895]: I0320 13:37:28.167457 4895 generic.go:334] "Generic (PLEG): container finished" podID="0d956767-71fa-4b4f-b113-659286aa149c" containerID="1c7d50723ca7fd25ea1ead739fb7583fdecf06bd6a271cca73f7903baffeed79" exitCode=0
Mar 20 13:37:28 crc kubenswrapper[4895]: I0320 13:37:28.167495 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerDied","Data":"1c7d50723ca7fd25ea1ead739fb7583fdecf06bd6a271cca73f7903baffeed79"}
Mar 20 13:37:28 crc kubenswrapper[4895]: I0320 13:37:28.190574 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" podStartSLOduration=1.618635595 podStartE2EDuration="13.190559469s" podCreationTimestamp="2026-03-20 13:37:15 +0000 UTC" firstStartedPulling="2026-03-20 13:37:16.365980194 +0000 UTC m=+935.875699160" lastFinishedPulling="2026-03-20 13:37:27.937904068 +0000 UTC m=+947.447623034" observedRunningTime="2026-03-20 13:37:28.188635431 +0000 UTC m=+947.698354397" watchObservedRunningTime="2026-03-20 13:37:28.190559469 +0000 UTC m=+947.700278435"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.179497 4895 generic.go:334] "Generic (PLEG): container finished" podID="0d956767-71fa-4b4f-b113-659286aa149c" containerID="682678ff9a2ec5da25b8e430b286ba615d2e6275ff405fff4ab39684f1ff5315" exitCode=0
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.179593 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerDied","Data":"682678ff9a2ec5da25b8e430b286ba615d2e6275ff405fff4ab39684f1ff5315"}
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.928617 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"]
Mar 20 13:37:29 crc kubenswrapper[4895]: E0320 13:37:29.929301 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="registry-server"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.929330 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="registry-server"
Mar 20 13:37:29 crc kubenswrapper[4895]: E0320 13:37:29.929366 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="extract-utilities"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.929378 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="extract-utilities"
Mar 20 13:37:29 crc kubenswrapper[4895]: E0320 13:37:29.929412 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="extract-content"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.929423 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="extract-content"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.929610 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="579f150c-66c2-4ea6-ad65-656ae172f27c" containerName="registry-server"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.930237 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gczdd"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.931949 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-45rjp"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.941457 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.941875 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 13:37:29 crc kubenswrapper[4895]: I0320 13:37:29.950742 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"]
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.006162 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfl9x\" (UniqueName: \"kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x\") pod \"openstack-operator-index-gczdd\" (UID: \"0cd0533e-236b-4d8b-9fac-bb33877dac26\") " pod="openstack-operators/openstack-operator-index-gczdd"
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.107130 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfl9x\" (UniqueName: \"kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x\") pod \"openstack-operator-index-gczdd\" (UID: \"0cd0533e-236b-4d8b-9fac-bb33877dac26\") " pod="openstack-operators/openstack-operator-index-gczdd"
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.126895 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfl9x\" (UniqueName: \"kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x\") pod \"openstack-operator-index-gczdd\" (UID: \"0cd0533e-236b-4d8b-9fac-bb33877dac26\") " pod="openstack-operators/openstack-operator-index-gczdd"
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.191340 4895 generic.go:334] "Generic (PLEG): container finished" podID="0d956767-71fa-4b4f-b113-659286aa149c" containerID="4e7cdd2e701183b41337a1d8ebd5dd625830a4e734ac5988a4b32d46d745af2c" exitCode=0
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.191430 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerDied","Data":"4e7cdd2e701183b41337a1d8ebd5dd625830a4e734ac5988a4b32d46d745af2c"}
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.278927 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gczdd"
Mar 20 13:37:30 crc kubenswrapper[4895]: I0320 13:37:30.554813 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"]
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.203705 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"39bd91cd44ab144fb84ad34e33814bf45ccf542275b9b2383012dd967f6d53ef"}
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.203996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"137232ece63db22ecab6a94d3050a602500c9483215221264d6314b0dc426b6a"}
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.204017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"5ba5a791b23fec9d6fadd29df060dc81b21b7fb8cc6ae900fd810eabd29f8d39"}
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.204034 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"ad65f43a794d90d37c636b93150c29a09d3f93714478caff1e6b613f4dd2919f"}
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.204049 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"d2e01c3264fa05b88f9f6c80143516588586c92ad961fd98dd7558018574616b"}
Mar 20 13:37:31 crc kubenswrapper[4895]: I0320 13:37:31.205206 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gczdd" event={"ID":"0cd0533e-236b-4d8b-9fac-bb33877dac26","Type":"ContainerStarted","Data":"a75c3998ea9ba5eb0ed565ea05157273f8a8e104cfe302fcba196e1b8eda646c"}
Mar 20 13:37:32 crc kubenswrapper[4895]: I0320 13:37:32.231277 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bg9zg" event={"ID":"0d956767-71fa-4b4f-b113-659286aa149c","Type":"ContainerStarted","Data":"80811560d395299c7078816a4be199f5de340a60697673868288955c83998fbd"}
Mar 20 13:37:32 crc kubenswrapper[4895]: I0320 13:37:32.231663 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bg9zg"
Mar 20 13:37:32 crc kubenswrapper[4895]: I0320 13:37:32.263723 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bg9zg" podStartSLOduration=5.057489534 podStartE2EDuration="17.263700845s" podCreationTimestamp="2026-03-20 13:37:15 +0000 UTC" firstStartedPulling="2026-03-20 13:37:15.715860894 +0000 UTC m=+935.225579850" lastFinishedPulling="2026-03-20 13:37:27.922072195 +0000 UTC m=+947.431791161" observedRunningTime="2026-03-20 13:37:32.260002633 +0000 UTC m=+951.769721629" watchObservedRunningTime="2026-03-20 13:37:32.263700845 +0000 UTC m=+951.773419831"
Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.105437 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"]
Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.711434 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cgtq6"]
Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.712835 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.736351 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cgtq6"] Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.864370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcb8z\" (UniqueName: \"kubernetes.io/projected/94b37e00-b08d-4f87-8e18-d758b22a4079-kube-api-access-wcb8z\") pod \"openstack-operator-index-cgtq6\" (UID: \"94b37e00-b08d-4f87-8e18-d758b22a4079\") " pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.965525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcb8z\" (UniqueName: \"kubernetes.io/projected/94b37e00-b08d-4f87-8e18-d758b22a4079-kube-api-access-wcb8z\") pod \"openstack-operator-index-cgtq6\" (UID: \"94b37e00-b08d-4f87-8e18-d758b22a4079\") " pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:33 crc kubenswrapper[4895]: I0320 13:37:33.983247 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcb8z\" (UniqueName: \"kubernetes.io/projected/94b37e00-b08d-4f87-8e18-d758b22a4079-kube-api-access-wcb8z\") pod \"openstack-operator-index-cgtq6\" (UID: \"94b37e00-b08d-4f87-8e18-d758b22a4079\") " pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.074318 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.263909 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gczdd" event={"ID":"0cd0533e-236b-4d8b-9fac-bb33877dac26","Type":"ContainerStarted","Data":"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156"} Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.264024 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-gczdd" podUID="0cd0533e-236b-4d8b-9fac-bb33877dac26" containerName="registry-server" containerID="cri-o://c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156" gracePeriod=2 Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.308635 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gczdd" podStartSLOduration=2.717639751 podStartE2EDuration="5.308614655s" podCreationTimestamp="2026-03-20 13:37:29 +0000 UTC" firstStartedPulling="2026-03-20 13:37:30.586229958 +0000 UTC m=+950.095948924" lastFinishedPulling="2026-03-20 13:37:33.177204852 +0000 UTC m=+952.686923828" observedRunningTime="2026-03-20 13:37:34.301781005 +0000 UTC m=+953.811499971" watchObservedRunningTime="2026-03-20 13:37:34.308614655 +0000 UTC m=+953.818333621" Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.488169 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cgtq6"] Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.653942 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gczdd" Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.780156 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfl9x\" (UniqueName: \"kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x\") pod \"0cd0533e-236b-4d8b-9fac-bb33877dac26\" (UID: \"0cd0533e-236b-4d8b-9fac-bb33877dac26\") " Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.794422 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x" (OuterVolumeSpecName: "kube-api-access-wfl9x") pod "0cd0533e-236b-4d8b-9fac-bb33877dac26" (UID: "0cd0533e-236b-4d8b-9fac-bb33877dac26"). InnerVolumeSpecName "kube-api-access-wfl9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:34 crc kubenswrapper[4895]: I0320 13:37:34.882599 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfl9x\" (UniqueName: \"kubernetes.io/projected/0cd0533e-236b-4d8b-9fac-bb33877dac26-kube-api-access-wfl9x\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.271733 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cgtq6" event={"ID":"94b37e00-b08d-4f87-8e18-d758b22a4079","Type":"ContainerStarted","Data":"078844b6764cf77913319338537da93ff9966dca5aae745af5ad372523143d11"} Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.271790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cgtq6" event={"ID":"94b37e00-b08d-4f87-8e18-d758b22a4079","Type":"ContainerStarted","Data":"11082c2f864a032676cec98e928e253097fabcc59ab062bfdaa45d1659209b0a"} Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.274731 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="0cd0533e-236b-4d8b-9fac-bb33877dac26" containerID="c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156" exitCode=0 Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.274763 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gczdd" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.274773 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gczdd" event={"ID":"0cd0533e-236b-4d8b-9fac-bb33877dac26","Type":"ContainerDied","Data":"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156"} Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.274799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gczdd" event={"ID":"0cd0533e-236b-4d8b-9fac-bb33877dac26","Type":"ContainerDied","Data":"a75c3998ea9ba5eb0ed565ea05157273f8a8e104cfe302fcba196e1b8eda646c"} Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.274815 4895 scope.go:117] "RemoveContainer" containerID="c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.294986 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cgtq6" podStartSLOduration=2.247707779 podStartE2EDuration="2.294935703s" podCreationTimestamp="2026-03-20 13:37:33 +0000 UTC" firstStartedPulling="2026-03-20 13:37:34.50633756 +0000 UTC m=+954.016056516" lastFinishedPulling="2026-03-20 13:37:34.553565474 +0000 UTC m=+954.063284440" observedRunningTime="2026-03-20 13:37:35.292918392 +0000 UTC m=+954.802637358" watchObservedRunningTime="2026-03-20 13:37:35.294935703 +0000 UTC m=+954.804654679" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.299327 4895 scope.go:117] "RemoveContainer" containerID="c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156" Mar 20 13:37:35 crc 
kubenswrapper[4895]: E0320 13:37:35.299872 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156\": container with ID starting with c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156 not found: ID does not exist" containerID="c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.299940 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156"} err="failed to get container status \"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156\": rpc error: code = NotFound desc = could not find container \"c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156\": container with ID starting with c6235f56e6f1f1dbf5ac615da94ffa306b8d95725e36f6048ca046005747f156 not found: ID does not exist" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.307528 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"] Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.311215 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-gczdd"] Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.559130 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:35 crc kubenswrapper[4895]: I0320 13:37:35.606522 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:37 crc kubenswrapper[4895]: I0320 13:37:37.221641 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd0533e-236b-4d8b-9fac-bb33877dac26" path="/var/lib/kubelet/pods/0cd0533e-236b-4d8b-9fac-bb33877dac26/volumes" Mar 20 
13:37:44 crc kubenswrapper[4895]: I0320 13:37:44.074846 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:44 crc kubenswrapper[4895]: I0320 13:37:44.075382 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:44 crc kubenswrapper[4895]: I0320 13:37:44.109415 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:44 crc kubenswrapper[4895]: I0320 13:37:44.384505 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cgtq6" Mar 20 13:37:45 crc kubenswrapper[4895]: I0320 13:37:45.566382 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bg9zg" Mar 20 13:37:46 crc kubenswrapper[4895]: I0320 13:37:46.119794 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5g6cb" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.297111 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.297673 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.462714 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2"] Mar 20 13:37:52 crc kubenswrapper[4895]: E0320 13:37:52.463085 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd0533e-236b-4d8b-9fac-bb33877dac26" containerName="registry-server" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.463111 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd0533e-236b-4d8b-9fac-bb33877dac26" containerName="registry-server" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.463297 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd0533e-236b-4d8b-9fac-bb33877dac26" containerName="registry-server" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.464529 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.468129 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cz7hz" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.472718 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2"] Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.532644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfgpg\" (UniqueName: \"kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.532704 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.532736 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.633672 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfgpg\" (UniqueName: \"kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.633733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.633754 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" 
(UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.634178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.634680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.655126 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfgpg\" (UniqueName: \"kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.782764 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:52 crc kubenswrapper[4895]: I0320 13:37:52.997085 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2"] Mar 20 13:37:53 crc kubenswrapper[4895]: I0320 13:37:53.402938 4895 generic.go:334] "Generic (PLEG): container finished" podID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerID="3b5878961ebf83fc3701137b41f44eaf0e29db290483e2d06b81e6bba3896f04" exitCode=0 Mar 20 13:37:53 crc kubenswrapper[4895]: I0320 13:37:53.402988 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" event={"ID":"0cbb9e45-1500-4b44-959c-2a9b4f0a587c","Type":"ContainerDied","Data":"3b5878961ebf83fc3701137b41f44eaf0e29db290483e2d06b81e6bba3896f04"} Mar 20 13:37:53 crc kubenswrapper[4895]: I0320 13:37:53.403017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" event={"ID":"0cbb9e45-1500-4b44-959c-2a9b4f0a587c","Type":"ContainerStarted","Data":"ddcbfd8cb66605f641738d7a3671165a402ac50a97d584a4c73c8fb8b3a67a52"} Mar 20 13:37:54 crc kubenswrapper[4895]: I0320 13:37:54.410704 4895 generic.go:334] "Generic (PLEG): container finished" podID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerID="4fbb9b88fc82bd35f4bdf4e721b08c65eb289c186a1c7e88eadcce4637460d9d" exitCode=0 Mar 20 13:37:54 crc kubenswrapper[4895]: I0320 13:37:54.410751 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" event={"ID":"0cbb9e45-1500-4b44-959c-2a9b4f0a587c","Type":"ContainerDied","Data":"4fbb9b88fc82bd35f4bdf4e721b08c65eb289c186a1c7e88eadcce4637460d9d"} Mar 20 13:37:55 crc kubenswrapper[4895]: I0320 13:37:55.422028 4895 generic.go:334] 
"Generic (PLEG): container finished" podID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerID="0188b28a4aebb51077d6068c291a974e12ab96661593897b823d507235e960dc" exitCode=0 Mar 20 13:37:55 crc kubenswrapper[4895]: I0320 13:37:55.422098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" event={"ID":"0cbb9e45-1500-4b44-959c-2a9b4f0a587c","Type":"ContainerDied","Data":"0188b28a4aebb51077d6068c291a974e12ab96661593897b823d507235e960dc"} Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.722945 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.888368 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfgpg\" (UniqueName: \"kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg\") pod \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.888458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util\") pod \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.888619 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle\") pod \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\" (UID: \"0cbb9e45-1500-4b44-959c-2a9b4f0a587c\") " Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.889687 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle" (OuterVolumeSpecName: "bundle") pod "0cbb9e45-1500-4b44-959c-2a9b4f0a587c" (UID: "0cbb9e45-1500-4b44-959c-2a9b4f0a587c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.897620 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg" (OuterVolumeSpecName: "kube-api-access-qfgpg") pod "0cbb9e45-1500-4b44-959c-2a9b4f0a587c" (UID: "0cbb9e45-1500-4b44-959c-2a9b4f0a587c"). InnerVolumeSpecName "kube-api-access-qfgpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.913686 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util" (OuterVolumeSpecName: "util") pod "0cbb9e45-1500-4b44-959c-2a9b4f0a587c" (UID: "0cbb9e45-1500-4b44-959c-2a9b4f0a587c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.990226 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfgpg\" (UniqueName: \"kubernetes.io/projected/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-kube-api-access-qfgpg\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.990267 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:56 crc kubenswrapper[4895]: I0320 13:37:56.990277 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0cbb9e45-1500-4b44-959c-2a9b4f0a587c-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4895]: I0320 13:37:57.440191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" event={"ID":"0cbb9e45-1500-4b44-959c-2a9b4f0a587c","Type":"ContainerDied","Data":"ddcbfd8cb66605f641738d7a3671165a402ac50a97d584a4c73c8fb8b3a67a52"} Mar 20 13:37:57 crc kubenswrapper[4895]: I0320 13:37:57.440516 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddcbfd8cb66605f641738d7a3671165a402ac50a97d584a4c73c8fb8b3a67a52" Mar 20 13:37:57 crc kubenswrapper[4895]: I0320 13:37:57.440261 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.144675 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-smz2g"] Mar 20 13:38:00 crc kubenswrapper[4895]: E0320 13:38:00.145525 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="util" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.145554 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="util" Mar 20 13:38:00 crc kubenswrapper[4895]: E0320 13:38:00.145603 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="pull" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.145615 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="pull" Mar 20 13:38:00 crc kubenswrapper[4895]: E0320 13:38:00.145634 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="extract" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.145646 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="extract" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.145866 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbb9e45-1500-4b44-959c-2a9b4f0a587c" containerName="extract" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.146615 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.150170 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.150204 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.150378 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.153566 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-smz2g"] Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.331679 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbw24\" (UniqueName: \"kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24\") pod \"auto-csr-approver-29566898-smz2g\" (UID: \"d152aade-646d-44e1-b484-b59672468f56\") " pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.433659 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbw24\" (UniqueName: \"kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24\") pod \"auto-csr-approver-29566898-smz2g\" (UID: \"d152aade-646d-44e1-b484-b59672468f56\") " pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.461548 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbw24\" (UniqueName: \"kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24\") pod \"auto-csr-approver-29566898-smz2g\" (UID: \"d152aade-646d-44e1-b484-b59672468f56\") " 
pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.482741 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:00 crc kubenswrapper[4895]: I0320 13:38:00.897515 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-smz2g"] Mar 20 13:38:00 crc kubenswrapper[4895]: W0320 13:38:00.911908 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd152aade_646d_44e1_b484_b59672468f56.slice/crio-b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4 WatchSource:0}: Error finding container b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4: Status 404 returned error can't find the container with id b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4 Mar 20 13:38:01 crc kubenswrapper[4895]: I0320 13:38:01.480093 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-smz2g" event={"ID":"d152aade-646d-44e1-b484-b59672468f56","Type":"ContainerStarted","Data":"b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4"} Mar 20 13:38:02 crc kubenswrapper[4895]: I0320 13:38:02.489536 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-smz2g" event={"ID":"d152aade-646d-44e1-b484-b59672468f56","Type":"ContainerStarted","Data":"5b77b0afde30292649e2437e895cf828f3d3d843a6e9b9834667d43b48e1903b"} Mar 20 13:38:02 crc kubenswrapper[4895]: I0320 13:38:02.509423 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566898-smz2g" podStartSLOduration=1.189294439 podStartE2EDuration="2.509382082s" podCreationTimestamp="2026-03-20 13:38:00 +0000 UTC" firstStartedPulling="2026-03-20 13:38:00.915167456 +0000 UTC 
m=+980.424886462" lastFinishedPulling="2026-03-20 13:38:02.235255139 +0000 UTC m=+981.744974105" observedRunningTime="2026-03-20 13:38:02.507655089 +0000 UTC m=+982.017374085" watchObservedRunningTime="2026-03-20 13:38:02.509382082 +0000 UTC m=+982.019101048" Mar 20 13:38:03 crc kubenswrapper[4895]: I0320 13:38:03.496643 4895 generic.go:334] "Generic (PLEG): container finished" podID="d152aade-646d-44e1-b484-b59672468f56" containerID="5b77b0afde30292649e2437e895cf828f3d3d843a6e9b9834667d43b48e1903b" exitCode=0 Mar 20 13:38:03 crc kubenswrapper[4895]: I0320 13:38:03.496701 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-smz2g" event={"ID":"d152aade-646d-44e1-b484-b59672468f56","Type":"ContainerDied","Data":"5b77b0afde30292649e2437e895cf828f3d3d843a6e9b9834667d43b48e1903b"} Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.713422 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m"] Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.714441 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.720196 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-lvclk" Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.772481 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m"] Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.826330 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.898830 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rcf\" (UniqueName: \"kubernetes.io/projected/d45d9a5f-9ee2-494a-9c05-5fd7cc094da4-kube-api-access-78rcf\") pod \"openstack-operator-controller-init-6f7459b8bf-lvm5m\" (UID: \"d45d9a5f-9ee2-494a-9c05-5fd7cc094da4\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:04 crc kubenswrapper[4895]: I0320 13:38:04.999779 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbw24\" (UniqueName: \"kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24\") pod \"d152aade-646d-44e1-b484-b59672468f56\" (UID: \"d152aade-646d-44e1-b484-b59672468f56\") " Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.000120 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rcf\" (UniqueName: \"kubernetes.io/projected/d45d9a5f-9ee2-494a-9c05-5fd7cc094da4-kube-api-access-78rcf\") pod \"openstack-operator-controller-init-6f7459b8bf-lvm5m\" (UID: \"d45d9a5f-9ee2-494a-9c05-5fd7cc094da4\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.008747 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24" (OuterVolumeSpecName: "kube-api-access-tbw24") pod "d152aade-646d-44e1-b484-b59672468f56" (UID: "d152aade-646d-44e1-b484-b59672468f56"). InnerVolumeSpecName "kube-api-access-tbw24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.016275 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rcf\" (UniqueName: \"kubernetes.io/projected/d45d9a5f-9ee2-494a-9c05-5fd7cc094da4-kube-api-access-78rcf\") pod \"openstack-operator-controller-init-6f7459b8bf-lvm5m\" (UID: \"d45d9a5f-9ee2-494a-9c05-5fd7cc094da4\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.042761 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.101470 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbw24\" (UniqueName: \"kubernetes.io/projected/d152aade-646d-44e1-b484-b59672468f56-kube-api-access-tbw24\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.511029 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m"] Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.512829 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-smz2g" event={"ID":"d152aade-646d-44e1-b484-b59672468f56","Type":"ContainerDied","Data":"b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4"} Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.512994 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b23275a1dfd5c35cce5161c5c35671e50b5b3030a7b2c5ce47b10bbb08fbeae4" Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.512875 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-smz2g" Mar 20 13:38:05 crc kubenswrapper[4895]: W0320 13:38:05.518114 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45d9a5f_9ee2_494a_9c05_5fd7cc094da4.slice/crio-eaeb83f14566257abda672350902ac90b7cffde213c70ec6a8f163c3ecb9cfe3 WatchSource:0}: Error finding container eaeb83f14566257abda672350902ac90b7cffde213c70ec6a8f163c3ecb9cfe3: Status 404 returned error can't find the container with id eaeb83f14566257abda672350902ac90b7cffde213c70ec6a8f163c3ecb9cfe3 Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.555422 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-vd7lc"] Mar 20 13:38:05 crc kubenswrapper[4895]: I0320 13:38:05.560731 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-vd7lc"] Mar 20 13:38:06 crc kubenswrapper[4895]: I0320 13:38:06.524461 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" event={"ID":"d45d9a5f-9ee2-494a-9c05-5fd7cc094da4","Type":"ContainerStarted","Data":"eaeb83f14566257abda672350902ac90b7cffde213c70ec6a8f163c3ecb9cfe3"} Mar 20 13:38:07 crc kubenswrapper[4895]: I0320 13:38:07.337000 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0601c4-bbf5-49e4-bdc8-bd482c79f041" path="/var/lib/kubelet/pods/9c0601c4-bbf5-49e4-bdc8-bd482c79f041/volumes" Mar 20 13:38:11 crc kubenswrapper[4895]: I0320 13:38:11.561639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" event={"ID":"d45d9a5f-9ee2-494a-9c05-5fd7cc094da4","Type":"ContainerStarted","Data":"f8de90805726c57f6b449e9cd6f46588898bbaf6bd54ae45fdf3c35f880ff3b4"} Mar 20 13:38:11 crc kubenswrapper[4895]: I0320 13:38:11.562202 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:11 crc kubenswrapper[4895]: I0320 13:38:11.589727 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" podStartSLOduration=2.689384946 podStartE2EDuration="7.589706254s" podCreationTimestamp="2026-03-20 13:38:04 +0000 UTC" firstStartedPulling="2026-03-20 13:38:05.521306171 +0000 UTC m=+985.031025127" lastFinishedPulling="2026-03-20 13:38:10.421627479 +0000 UTC m=+989.931346435" observedRunningTime="2026-03-20 13:38:11.585063867 +0000 UTC m=+991.094782863" watchObservedRunningTime="2026-03-20 13:38:11.589706254 +0000 UTC m=+991.099425220" Mar 20 13:38:15 crc kubenswrapper[4895]: I0320 13:38:15.046293 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-lvm5m" Mar 20 13:38:22 crc kubenswrapper[4895]: I0320 13:38:22.296848 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:22 crc kubenswrapper[4895]: I0320 13:38:22.298021 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:43 crc kubenswrapper[4895]: I0320 13:38:43.333030 4895 scope.go:117] "RemoveContainer" containerID="240f0bddc8821c2a2acebda427b3d73cee92643ee86ba86a641ecfa039f24f8d" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.878306 4895 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp"] Mar 20 13:38:51 crc kubenswrapper[4895]: E0320 13:38:51.879037 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d152aade-646d-44e1-b484-b59672468f56" containerName="oc" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.879050 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d152aade-646d-44e1-b484-b59672468f56" containerName="oc" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.879180 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d152aade-646d-44e1-b484-b59672468f56" containerName="oc" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.879704 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.881552 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-b4dq4" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.888696 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.889799 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.892655 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw7vp\" (UniqueName: \"kubernetes.io/projected/45950a96-a521-4429-b7d1-71efa644a087-kube-api-access-bw7vp\") pod \"barbican-operator-controller-manager-59bc569d95-h6qcp\" (UID: \"45950a96-a521-4429-b7d1-71efa644a087\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.892775 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qdwzp" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.892916 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwzth\" (UniqueName: \"kubernetes.io/projected/810c2ef6-f5e6-4003-b01e-76e1edbbe452-kube-api-access-rwzth\") pod \"cinder-operator-controller-manager-8d58dc466-8tttj\" (UID: \"810c2ef6-f5e6-4003-b01e-76e1edbbe452\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.902166 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.910080 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.927488 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.928364 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.932702 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nwhtw" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.944151 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.948493 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk"] Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.949213 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.969053 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5x78f" Mar 20 13:38:51 crc kubenswrapper[4895]: I0320 13:38:51.982078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.010168 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwzth\" (UniqueName: \"kubernetes.io/projected/810c2ef6-f5e6-4003-b01e-76e1edbbe452-kube-api-access-rwzth\") pod \"cinder-operator-controller-manager-8d58dc466-8tttj\" (UID: \"810c2ef6-f5e6-4003-b01e-76e1edbbe452\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.010230 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw7vp\" (UniqueName: 
\"kubernetes.io/projected/45950a96-a521-4429-b7d1-71efa644a087-kube-api-access-bw7vp\") pod \"barbican-operator-controller-manager-59bc569d95-h6qcp\" (UID: \"45950a96-a521-4429-b7d1-71efa644a087\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.016541 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.018117 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.030653 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.031187 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-pxpg8" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.041984 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw7vp\" (UniqueName: \"kubernetes.io/projected/45950a96-a521-4429-b7d1-71efa644a087-kube-api-access-bw7vp\") pod \"barbican-operator-controller-manager-59bc569d95-h6qcp\" (UID: \"45950a96-a521-4429-b7d1-71efa644a087\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.054756 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwzth\" (UniqueName: \"kubernetes.io/projected/810c2ef6-f5e6-4003-b01e-76e1edbbe452-kube-api-access-rwzth\") pod \"cinder-operator-controller-manager-8d58dc466-8tttj\" (UID: \"810c2ef6-f5e6-4003-b01e-76e1edbbe452\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 
13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.082262 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.083100 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.092040 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.100842 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-d62dx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.106533 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.111594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8b26\" (UniqueName: \"kubernetes.io/projected/e59747be-3214-43b9-b75b-88b8e7e71484-kube-api-access-m8b26\") pod \"glance-operator-controller-manager-79df6bcc97-vvwnk\" (UID: \"e59747be-3214-43b9-b75b-88b8e7e71484\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.111630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdp2n\" (UniqueName: \"kubernetes.io/projected/8ec4bb36-473c-4103-bfeb-10e8df206b9a-kube-api-access-pdp2n\") pod \"designate-operator-controller-manager-588d4d986b-6rrsx\" (UID: \"8ec4bb36-473c-4103-bfeb-10e8df206b9a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:38:52 crc 
kubenswrapper[4895]: I0320 13:38:52.114253 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.115347 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.116991 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.118713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.118729 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2hf6c" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.122662 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.133028 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hf9bx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.148656 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.149817 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.157237 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6krgw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.172666 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.189727 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.196704 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.208450 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.208835 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.209296 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.216508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfktk\" (UniqueName: \"kubernetes.io/projected/9d9feeae-ff51-432c-a4a4-e375d743f0b3-kube-api-access-gfktk\") pod \"keystone-operator-controller-manager-768b96df4c-nsh2d\" (UID: \"9d9feeae-ff51-432c-a4a4-e375d743f0b3\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.216562 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qtd\" (UniqueName: \"kubernetes.io/projected/0039adb6-7c13-414b-bbd6-25e759da85b7-kube-api-access-g5qtd\") pod \"horizon-operator-controller-manager-8464cc45fb-x55wz\" (UID: \"0039adb6-7c13-414b-bbd6-25e759da85b7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.217126 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtrp\" (UniqueName: \"kubernetes.io/projected/90753829-7cac-4f8f-8aa5-086430d0eafa-kube-api-access-dgtrp\") pod \"heat-operator-controller-manager-67dd5f86f5-f4sjs\" (UID: \"90753829-7cac-4f8f-8aa5-086430d0eafa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.217179 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8b26\" (UniqueName: \"kubernetes.io/projected/e59747be-3214-43b9-b75b-88b8e7e71484-kube-api-access-m8b26\") pod \"glance-operator-controller-manager-79df6bcc97-vvwnk\" (UID: \"e59747be-3214-43b9-b75b-88b8e7e71484\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" 
Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.217202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdp2n\" (UniqueName: \"kubernetes.io/projected/8ec4bb36-473c-4103-bfeb-10e8df206b9a-kube-api-access-pdp2n\") pod \"designate-operator-controller-manager-588d4d986b-6rrsx\" (UID: \"8ec4bb36-473c-4103-bfeb-10e8df206b9a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.218668 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.231380 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q842c" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.240064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdp2n\" (UniqueName: \"kubernetes.io/projected/8ec4bb36-473c-4103-bfeb-10e8df206b9a-kube-api-access-pdp2n\") pod \"designate-operator-controller-manager-588d4d986b-6rrsx\" (UID: \"8ec4bb36-473c-4103-bfeb-10e8df206b9a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.244434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8b26\" (UniqueName: \"kubernetes.io/projected/e59747be-3214-43b9-b75b-88b8e7e71484-kube-api-access-m8b26\") pod \"glance-operator-controller-manager-79df6bcc97-vvwnk\" (UID: \"e59747be-3214-43b9-b75b-88b8e7e71484\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.244729 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.264641 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.265457 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.284476 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ms5tv" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.289456 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.290338 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.303759 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.303814 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.303855 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.304487 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.304544 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6" gracePeriod=600 Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.304617 4895 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8xbjz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.304998 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323371 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtrp\" (UniqueName: \"kubernetes.io/projected/90753829-7cac-4f8f-8aa5-086430d0eafa-kube-api-access-dgtrp\") pod \"heat-operator-controller-manager-67dd5f86f5-f4sjs\" (UID: \"90753829-7cac-4f8f-8aa5-086430d0eafa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323470 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfktk\" (UniqueName: \"kubernetes.io/projected/9d9feeae-ff51-432c-a4a4-e375d743f0b3-kube-api-access-gfktk\") pod \"keystone-operator-controller-manager-768b96df4c-nsh2d\" (UID: \"9d9feeae-ff51-432c-a4a4-e375d743f0b3\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323521 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5mcp\" (UniqueName: \"kubernetes.io/projected/7d7e8ef8-065c-40c0-a396-915b7efdd1a0-kube-api-access-z5mcp\") pod \"manila-operator-controller-manager-55f864c847-bqdrg\" (UID: \"7d7e8ef8-065c-40c0-a396-915b7efdd1a0\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323582 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qtd\" (UniqueName: \"kubernetes.io/projected/0039adb6-7c13-414b-bbd6-25e759da85b7-kube-api-access-g5qtd\") pod 
\"horizon-operator-controller-manager-8464cc45fb-x55wz\" (UID: \"0039adb6-7c13-414b-bbd6-25e759da85b7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxghd\" (UniqueName: \"kubernetes.io/projected/e386d39f-7654-4d1d-84fc-6796309ac427-kube-api-access-vxghd\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323681 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljc8s\" (UniqueName: \"kubernetes.io/projected/4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa-kube-api-access-ljc8s\") pod \"ironic-operator-controller-manager-6f787dddc9-pk85v\" (UID: \"4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.323756 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.335469 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.363492 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq"] Mar 20 13:38:52 crc 
kubenswrapper[4895]: I0320 13:38:52.383969 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qtd\" (UniqueName: \"kubernetes.io/projected/0039adb6-7c13-414b-bbd6-25e759da85b7-kube-api-access-g5qtd\") pod \"horizon-operator-controller-manager-8464cc45fb-x55wz\" (UID: \"0039adb6-7c13-414b-bbd6-25e759da85b7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.392258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfktk\" (UniqueName: \"kubernetes.io/projected/9d9feeae-ff51-432c-a4a4-e375d743f0b3-kube-api-access-gfktk\") pod \"keystone-operator-controller-manager-768b96df4c-nsh2d\" (UID: \"9d9feeae-ff51-432c-a4a4-e375d743f0b3\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.395583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtrp\" (UniqueName: \"kubernetes.io/projected/90753829-7cac-4f8f-8aa5-086430d0eafa-kube-api-access-dgtrp\") pod \"heat-operator-controller-manager-67dd5f86f5-f4sjs\" (UID: \"90753829-7cac-4f8f-8aa5-086430d0eafa\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.404494 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.405327 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.409678 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.413750 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2fh8q" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.416606 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.423434 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.424227 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5mcp\" (UniqueName: \"kubernetes.io/projected/7d7e8ef8-065c-40c0-a396-915b7efdd1a0-kube-api-access-z5mcp\") pod \"manila-operator-controller-manager-55f864c847-bqdrg\" (UID: \"7d7e8ef8-065c-40c0-a396-915b7efdd1a0\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425703 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxghd\" (UniqueName: \"kubernetes.io/projected/e386d39f-7654-4d1d-84fc-6796309ac427-kube-api-access-vxghd\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8jzxl\" (UniqueName: \"kubernetes.io/projected/de58ceb7-b3dd-487f-95eb-48d02a0accc3-kube-api-access-8jzxl\") pod \"neutron-operator-controller-manager-767865f676-xvkkg\" (UID: \"de58ceb7-b3dd-487f-95eb-48d02a0accc3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425789 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqqm\" (UniqueName: \"kubernetes.io/projected/7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c-kube-api-access-nsqqm\") pod \"nova-operator-controller-manager-5d488d59fb-7gt5d\" (UID: \"7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425808 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljc8s\" (UniqueName: \"kubernetes.io/projected/4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa-kube-api-access-ljc8s\") pod \"ironic-operator-controller-manager-6f787dddc9-pk85v\" (UID: \"4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425829 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh57z\" (UniqueName: \"kubernetes.io/projected/0d066703-200f-472a-b768-f6aef5eb347f-kube-api-access-qh57z\") pod \"mariadb-operator-controller-manager-67ccfc9778-c2kgq\" (UID: \"0d066703-200f-472a-b768-f6aef5eb347f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssvz\" (UniqueName: 
\"kubernetes.io/projected/73e2c644-bdbd-4769-946a-4e2111a28326-kube-api-access-bssvz\") pod \"octavia-operator-controller-manager-5b9f45d989-8nddx\" (UID: \"73e2c644-bdbd-4769-946a-4e2111a28326\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.425874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.425962 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.426000 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:52.925984776 +0000 UTC m=+1032.435703742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.428780 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rkcsk" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.440578 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.465170 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxghd\" (UniqueName: \"kubernetes.io/projected/e386d39f-7654-4d1d-84fc-6796309ac427-kube-api-access-vxghd\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.476517 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.482361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljc8s\" (UniqueName: \"kubernetes.io/projected/4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa-kube-api-access-ljc8s\") pod \"ironic-operator-controller-manager-6f787dddc9-pk85v\" (UID: \"4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.490738 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5mcp\" (UniqueName: \"kubernetes.io/projected/7d7e8ef8-065c-40c0-a396-915b7efdd1a0-kube-api-access-z5mcp\") pod \"manila-operator-controller-manager-55f864c847-bqdrg\" (UID: \"7d7e8ef8-065c-40c0-a396-915b7efdd1a0\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.504578 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.526818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh57z\" (UniqueName: \"kubernetes.io/projected/0d066703-200f-472a-b768-f6aef5eb347f-kube-api-access-qh57z\") pod \"mariadb-operator-controller-manager-67ccfc9778-c2kgq\" (UID: \"0d066703-200f-472a-b768-f6aef5eb347f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.526861 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssvz\" (UniqueName: \"kubernetes.io/projected/73e2c644-bdbd-4769-946a-4e2111a28326-kube-api-access-bssvz\") pod \"octavia-operator-controller-manager-5b9f45d989-8nddx\" 
(UID: \"73e2c644-bdbd-4769-946a-4e2111a28326\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.526939 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzxl\" (UniqueName: \"kubernetes.io/projected/de58ceb7-b3dd-487f-95eb-48d02a0accc3-kube-api-access-8jzxl\") pod \"neutron-operator-controller-manager-767865f676-xvkkg\" (UID: \"de58ceb7-b3dd-487f-95eb-48d02a0accc3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.526964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqqm\" (UniqueName: \"kubernetes.io/projected/7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c-kube-api-access-nsqqm\") pod \"nova-operator-controller-manager-5d488d59fb-7gt5d\" (UID: \"7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.540849 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-br56m"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.541687 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.547963 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-cpdmx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.549975 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.550804 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.553145 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8t9wv" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.562796 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssvz\" (UniqueName: \"kubernetes.io/projected/73e2c644-bdbd-4769-946a-4e2111a28326-kube-api-access-bssvz\") pod \"octavia-operator-controller-manager-5b9f45d989-8nddx\" (UID: \"73e2c644-bdbd-4769-946a-4e2111a28326\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.568084 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-br56m"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.568938 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh57z\" (UniqueName: \"kubernetes.io/projected/0d066703-200f-472a-b768-f6aef5eb347f-kube-api-access-qh57z\") pod \"mariadb-operator-controller-manager-67ccfc9778-c2kgq\" (UID: \"0d066703-200f-472a-b768-f6aef5eb347f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.582024 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.586655 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqqm\" (UniqueName: \"kubernetes.io/projected/7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c-kube-api-access-nsqqm\") pod \"nova-operator-controller-manager-5d488d59fb-7gt5d\" (UID: \"7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.586659 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzxl\" (UniqueName: \"kubernetes.io/projected/de58ceb7-b3dd-487f-95eb-48d02a0accc3-kube-api-access-8jzxl\") pod \"neutron-operator-controller-manager-767865f676-xvkkg\" (UID: \"de58ceb7-b3dd-487f-95eb-48d02a0accc3\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.588298 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.590133 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.614940 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.615866 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4rr9s" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.619901 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.634508 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.722278 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.749056 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.750472 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprqt\" (UniqueName: \"kubernetes.io/projected/680cd993-89dd-47f4-8555-b49ff8293a76-kube-api-access-lprqt\") pod \"placement-operator-controller-manager-5784578c99-f6m2m\" (UID: \"680cd993-89dd-47f4-8555-b49ff8293a76\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.750508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm8rf\" (UniqueName: \"kubernetes.io/projected/de5694d4-a796-46ee-9f84-4b9d35475f27-kube-api-access-cm8rf\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.750528 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66jmz\" (UniqueName: \"kubernetes.io/projected/f8f4d668-e8ad-4c6c-9107-6221569d3079-kube-api-access-66jmz\") pod \"ovn-operator-controller-manager-884679f54-br56m\" (UID: \"f8f4d668-e8ad-4c6c-9107-6221569d3079\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.750566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.751861 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gd46c" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.755259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.758141 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.767550 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.783873 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.785276 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.788637 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4j8kk" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.790488 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.799994 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.802229 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.809136 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.812720 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hvgnz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.815322 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.824115 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.840513 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.841590 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.843629 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rnf4b" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.848519 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.851294 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb72p\" (UniqueName: \"kubernetes.io/projected/e9c5c274-21be-4e53-99f7-d1ab4f352142-kube-api-access-sb72p\") pod \"swift-operator-controller-manager-c674c5965-6xwg6\" (UID: \"e9c5c274-21be-4e53-99f7-d1ab4f352142\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.851362 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lprqt\" (UniqueName: \"kubernetes.io/projected/680cd993-89dd-47f4-8555-b49ff8293a76-kube-api-access-lprqt\") pod \"placement-operator-controller-manager-5784578c99-f6m2m\" (UID: \"680cd993-89dd-47f4-8555-b49ff8293a76\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.851409 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm8rf\" (UniqueName: \"kubernetes.io/projected/de5694d4-a796-46ee-9f84-4b9d35475f27-kube-api-access-cm8rf\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.851432 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66jmz\" (UniqueName: \"kubernetes.io/projected/f8f4d668-e8ad-4c6c-9107-6221569d3079-kube-api-access-66jmz\") pod \"ovn-operator-controller-manager-884679f54-br56m\" (UID: \"f8f4d668-e8ad-4c6c-9107-6221569d3079\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.851461 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.851577 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.851612 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:53.351600625 +0000 UTC m=+1032.861319591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.853580 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.880964 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6" exitCode=0 Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.881007 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6"} Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.881043 4895 scope.go:117] "RemoveContainer" containerID="fdf26e1b03bf143f1004ffa2b193777b6d4fb9ca12cb442ec95767fe44f2fb85" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.894206 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.897459 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.900899 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ngch4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.900961 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.900964 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66jmz\" (UniqueName: \"kubernetes.io/projected/f8f4d668-e8ad-4c6c-9107-6221569d3079-kube-api-access-66jmz\") pod \"ovn-operator-controller-manager-884679f54-br56m\" (UID: \"f8f4d668-e8ad-4c6c-9107-6221569d3079\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.901059 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.901093 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.901876 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprqt\" (UniqueName: \"kubernetes.io/projected/680cd993-89dd-47f4-8555-b49ff8293a76-kube-api-access-lprqt\") pod \"placement-operator-controller-manager-5784578c99-f6m2m\" (UID: \"680cd993-89dd-47f4-8555-b49ff8293a76\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.903748 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm8rf\" (UniqueName: 
\"kubernetes.io/projected/de5694d4-a796-46ee-9f84-4b9d35475f27-kube-api-access-cm8rf\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.916845 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.924204 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2"] Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.924308 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.926075 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cjhnq" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.932350 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.951807 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.953043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8n86\" (UniqueName: \"kubernetes.io/projected/1d3b843f-4b33-455f-9d52-6a0267d370cb-kube-api-access-j8n86\") pod \"watcher-operator-controller-manager-6c4d75f7f9-8xlbt\" (UID: \"1d3b843f-4b33-455f-9d52-6a0267d370cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.953085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb72p\" (UniqueName: \"kubernetes.io/projected/e9c5c274-21be-4e53-99f7-d1ab4f352142-kube-api-access-sb72p\") pod \"swift-operator-controller-manager-c674c5965-6xwg6\" (UID: \"e9c5c274-21be-4e53-99f7-d1ab4f352142\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.953269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwlh\" (UniqueName: \"kubernetes.io/projected/4ce923f6-b8eb-4461-a222-0af773470e76-kube-api-access-pbwlh\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-rbm9d\" (UID: \"4ce923f6-b8eb-4461-a222-0af773470e76\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.953371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796gx\" (UniqueName: \"kubernetes.io/projected/cc9f95f5-a6fd-4638-989e-3dff592f5022-kube-api-access-796gx\") pod \"test-operator-controller-manager-5c5cb9c4d7-ppltl\" (UID: \"cc9f95f5-a6fd-4638-989e-3dff592f5022\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" 
Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.953425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.953576 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: E0320 13:38:52.953618 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:53.95360585 +0000 UTC m=+1033.463324816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:52 crc kubenswrapper[4895]: I0320 13:38:52.971976 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb72p\" (UniqueName: \"kubernetes.io/projected/e9c5c274-21be-4e53-99f7-d1ab4f352142-kube-api-access-sb72p\") pod \"swift-operator-controller-manager-c674c5965-6xwg6\" (UID: \"e9c5c274-21be-4e53-99f7-d1ab4f352142\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.023922 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp"] Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.036653 4895 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45950a96_a521_4429_b7d1_71efa644a087.slice/crio-6d966b1faaf0a5e43c1155ae6764fd735067bcece2b02c7b66332a579d0257c1 WatchSource:0}: Error finding container 6d966b1faaf0a5e43c1155ae6764fd735067bcece2b02c7b66332a579d0257c1: Status 404 returned error can't find the container with id 6d966b1faaf0a5e43c1155ae6764fd735067bcece2b02c7b66332a579d0257c1 Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.052199 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058359 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796gx\" (UniqueName: \"kubernetes.io/projected/cc9f95f5-a6fd-4638-989e-3dff592f5022-kube-api-access-796gx\") pod \"test-operator-controller-manager-5c5cb9c4d7-ppltl\" (UID: \"cc9f95f5-a6fd-4638-989e-3dff592f5022\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058481 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvj5r\" (UniqueName: \"kubernetes.io/projected/43459d05-1aac-46b1-b690-1b8c948bbb07-kube-api-access-kvj5r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-c2cz2\" (UID: \"43459d05-1aac-46b1-b690-1b8c948bbb07\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058523 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " 
pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058547 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8n86\" (UniqueName: \"kubernetes.io/projected/1d3b843f-4b33-455f-9d52-6a0267d370cb-kube-api-access-j8n86\") pod \"watcher-operator-controller-manager-6c4d75f7f9-8xlbt\" (UID: \"1d3b843f-4b33-455f-9d52-6a0267d370cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058611 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csx8\" (UniqueName: \"kubernetes.io/projected/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-kube-api-access-7csx8\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058761 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.058809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwlh\" (UniqueName: \"kubernetes.io/projected/4ce923f6-b8eb-4461-a222-0af773470e76-kube-api-access-pbwlh\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-rbm9d\" (UID: \"4ce923f6-b8eb-4461-a222-0af773470e76\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" Mar 20 13:38:53 crc kubenswrapper[4895]: 
I0320 13:38:53.080133 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8n86\" (UniqueName: \"kubernetes.io/projected/1d3b843f-4b33-455f-9d52-6a0267d370cb-kube-api-access-j8n86\") pod \"watcher-operator-controller-manager-6c4d75f7f9-8xlbt\" (UID: \"1d3b843f-4b33-455f-9d52-6a0267d370cb\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.081258 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.081577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwlh\" (UniqueName: \"kubernetes.io/projected/4ce923f6-b8eb-4461-a222-0af773470e76-kube-api-access-pbwlh\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-rbm9d\" (UID: \"4ce923f6-b8eb-4461-a222-0af773470e76\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.093279 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796gx\" (UniqueName: \"kubernetes.io/projected/cc9f95f5-a6fd-4638-989e-3dff592f5022-kube-api-access-796gx\") pod \"test-operator-controller-manager-5c5cb9c4d7-ppltl\" (UID: \"cc9f95f5-a6fd-4638-989e-3dff592f5022\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.161040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc 
kubenswrapper[4895]: I0320 13:38:53.161325 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvj5r\" (UniqueName: \"kubernetes.io/projected/43459d05-1aac-46b1-b690-1b8c948bbb07-kube-api-access-kvj5r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-c2cz2\" (UID: \"43459d05-1aac-46b1-b690-1b8c948bbb07\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.161661 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.161704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csx8\" (UniqueName: \"kubernetes.io/projected/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-kube-api-access-7csx8\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.161959 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.161999 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:53.66198539 +0000 UTC m=+1033.171704356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.162823 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.162915 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:53.662894553 +0000 UTC m=+1033.172613509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.189271 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvj5r\" (UniqueName: \"kubernetes.io/projected/43459d05-1aac-46b1-b690-1b8c948bbb07-kube-api-access-kvj5r\") pod \"rabbitmq-cluster-operator-manager-668c99d594-c2cz2\" (UID: \"43459d05-1aac-46b1-b690-1b8c948bbb07\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.200963 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csx8\" (UniqueName: \"kubernetes.io/projected/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-kube-api-access-7csx8\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " 
pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.295227 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.300846 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.305995 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.321285 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.364727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.365640 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.365693 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:38:54.365675813 +0000 UTC m=+1033.875394779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.394680 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.617420 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.656296 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.673195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.673326 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.674004 4895 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.674034 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.674127 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:54.674092709 +0000 UTC m=+1034.183811675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.674349 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:54.674323675 +0000 UTC m=+1034.184042641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.693788 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg"] Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.704113 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59747be_3214_43b9_b75b_88b8e7e71484.slice/crio-4e9a0e48b56a755b934f6ab3a55460da2993a1a9fde72845e69c9f244aa99128 WatchSource:0}: Error finding container 4e9a0e48b56a755b934f6ab3a55460da2993a1a9fde72845e69c9f244aa99128: Status 404 returned error can't find the container with id 4e9a0e48b56a755b934f6ab3a55460da2993a1a9fde72845e69c9f244aa99128 Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.709383 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.713861 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.720444 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.830093 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-br56m"] Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.838930 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f4d668_e8ad_4c6c_9107_6221569d3079.slice/crio-c990d80fa6a18592a7903e1e81b6382edd2499c6e1cd944f809988354953c59c WatchSource:0}: Error finding container c990d80fa6a18592a7903e1e81b6382edd2499c6e1cd944f809988354953c59c: Status 404 returned error can't find the container with id c990d80fa6a18592a7903e1e81b6382edd2499c6e1cd944f809988354953c59c Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.840374 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.848308 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.853826 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq"] Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.857245 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c144b6f_b36e_442a_8aa8_8ffa93bf9eaa.slice/crio-1fb0804c9d8895c85340606497c86ea36fd311822749e99456b4f50eee83b97e WatchSource:0}: Error finding container 1fb0804c9d8895c85340606497c86ea36fd311822749e99456b4f50eee83b97e: Status 404 returned error can't find the container with id 1fb0804c9d8895c85340606497c86ea36fd311822749e99456b4f50eee83b97e Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.859004 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde58ceb7_b3dd_487f_95eb_48d02a0accc3.slice/crio-ec7f66d163b97fe23af88a55b5baf705b5a1a017dba0823948775adec4f0e57e WatchSource:0}: Error finding container ec7f66d163b97fe23af88a55b5baf705b5a1a017dba0823948775adec4f0e57e: Status 404 returned error can't find the 
container with id ec7f66d163b97fe23af88a55b5baf705b5a1a017dba0823948775adec4f0e57e Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.888966 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" event={"ID":"810c2ef6-f5e6-4003-b01e-76e1edbbe452","Type":"ContainerStarted","Data":"ff6b0dcdf93534c5925a826662058e4fc797ea96bfffdf12cb8e0bd9167453f9"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.891105 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.892599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" event={"ID":"0d066703-200f-472a-b768-f6aef5eb347f","Type":"ContainerStarted","Data":"19ff759288c9bf4a73a4790fdeb0a1fbb84f34e44a6811fd17abcdcf23a6d816"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.893709 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" event={"ID":"4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa","Type":"ContainerStarted","Data":"1fb0804c9d8895c85340606497c86ea36fd311822749e99456b4f50eee83b97e"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.895100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" event={"ID":"73e2c644-bdbd-4769-946a-4e2111a28326","Type":"ContainerStarted","Data":"99551518fde7f9dcaae69c12528539feab52a865ebcf9610c5af181a66924890"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.896141 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" event={"ID":"8ec4bb36-473c-4103-bfeb-10e8df206b9a","Type":"ContainerStarted","Data":"78435d9cf654b858c6e26a689303625ef8c386c4a4c0120cc26ba2c0f594b1b8"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.897112 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" event={"ID":"90753829-7cac-4f8f-8aa5-086430d0eafa","Type":"ContainerStarted","Data":"88d5cc3d97163b6d6c76322e586d3302220360622e2be2d9357a30d9989413f4"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.899939 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" event={"ID":"f8f4d668-e8ad-4c6c-9107-6221569d3079","Type":"ContainerStarted","Data":"c990d80fa6a18592a7903e1e81b6382edd2499c6e1cd944f809988354953c59c"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.901181 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" event={"ID":"e59747be-3214-43b9-b75b-88b8e7e71484","Type":"ContainerStarted","Data":"4e9a0e48b56a755b934f6ab3a55460da2993a1a9fde72845e69c9f244aa99128"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.902493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" event={"ID":"0039adb6-7c13-414b-bbd6-25e759da85b7","Type":"ContainerStarted","Data":"739c5602612e200b2ba67752d419e9f9a05cd28b4ef145d10e95f7340bf73c9f"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.903624 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" event={"ID":"9d9feeae-ff51-432c-a4a4-e375d743f0b3","Type":"ContainerStarted","Data":"d7a84ad9cf22856da2db9d75cfb4cc4705ff362ce677c8a3ea32e653b04b9485"} Mar 20 13:38:53 crc 
kubenswrapper[4895]: I0320 13:38:53.910737 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" event={"ID":"de58ceb7-b3dd-487f-95eb-48d02a0accc3","Type":"ContainerStarted","Data":"ec7f66d163b97fe23af88a55b5baf705b5a1a017dba0823948775adec4f0e57e"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.913930 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" event={"ID":"7d7e8ef8-065c-40c0-a396-915b7efdd1a0","Type":"ContainerStarted","Data":"794aad3db297cc43221a1f7667ad071c16fc714e5ee53bf59ec3bbb18b25621a"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.915989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" event={"ID":"45950a96-a521-4429-b7d1-71efa644a087","Type":"ContainerStarted","Data":"6d966b1faaf0a5e43c1155ae6764fd735067bcece2b02c7b66332a579d0257c1"} Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.961109 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.973272 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.978461 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.978864 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: 
secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.978930 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:55.978913917 +0000 UTC m=+1035.488632883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.981691 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6"] Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.992635 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2"] Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.993869 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43459d05_1aac_46b1_b690_1b8c948bbb07.slice/crio-904cbd26ac57921a3e813a26c2b4a18a8279102a0f72f4b3d6c2caaed3b3b839 WatchSource:0}: Error finding container 904cbd26ac57921a3e813a26c2b4a18a8279102a0f72f4b3d6c2caaed3b3b839: Status 404 returned error can't find the container with id 904cbd26ac57921a3e813a26c2b4a18a8279102a0f72f4b3d6c2caaed3b3b839 Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.995704 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-796gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-ppltl_openstack-operators(cc9f95f5-a6fd-4638-989e-3dff592f5022): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.995781 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvj5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-c2cz2_openstack-operators(43459d05-1aac-46b1-b690-1b8c948bbb07): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:53 crc kubenswrapper[4895]: I0320 13:38:53.997068 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m"] Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.997121 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" podUID="cc9f95f5-a6fd-4638-989e-3dff592f5022" Mar 20 13:38:53 crc kubenswrapper[4895]: E0320 13:38:53.997155 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" 
podUID="43459d05-1aac-46b1-b690-1b8c948bbb07" Mar 20 13:38:53 crc kubenswrapper[4895]: W0320 13:38:53.999341 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c5c274_21be_4e53_99f7_d1ab4f352142.slice/crio-a1dcc523bf17ef0b0311df9fffd1fed692f4871bc82ed6d4c1c31828540d0940 WatchSource:0}: Error finding container a1dcc523bf17ef0b0311df9fffd1fed692f4871bc82ed6d4c1c31828540d0940: Status 404 returned error can't find the container with id a1dcc523bf17ef0b0311df9fffd1fed692f4871bc82ed6d4c1c31828540d0940 Mar 20 13:38:54 crc kubenswrapper[4895]: W0320 13:38:54.001801 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680cd993_89dd_47f4_8555_b49ff8293a76.slice/crio-eeb30e73cf33a7be85ee4f8000aff0c08f5e57b9ce3f0e5e3556ddfe8f1bd2fd WatchSource:0}: Error finding container eeb30e73cf33a7be85ee4f8000aff0c08f5e57b9ce3f0e5e3556ddfe8f1bd2fd: Status 404 returned error can't find the container with id eeb30e73cf33a7be85ee4f8000aff0c08f5e57b9ce3f0e5e3556ddfe8f1bd2fd Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.007214 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt"] Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.008781 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lprqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-f6m2m_openstack-operators(680cd993-89dd-47f4-8555-b49ff8293a76): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.009436 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sb72p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-6xwg6_openstack-operators(e9c5c274-21be-4e53-99f7-d1ab4f352142): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.010716 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" podUID="680cd993-89dd-47f4-8555-b49ff8293a76" Mar 20 13:38:54 crc 
kubenswrapper[4895]: E0320 13:38:54.010801 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" podUID="e9c5c274-21be-4e53-99f7-d1ab4f352142" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.014541 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d"] Mar 20 13:38:54 crc kubenswrapper[4895]: W0320 13:38:54.019054 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ce923f6_b8eb_4461_a222_0af773470e76.slice/crio-9954a6fd29ba98ac7cdbf54761267394a03378dffb88b43f454d6d8cbf37f164 WatchSource:0}: Error finding container 9954a6fd29ba98ac7cdbf54761267394a03378dffb88b43f454d6d8cbf37f164: Status 404 returned error can't find the container with id 9954a6fd29ba98ac7cdbf54761267394a03378dffb88b43f454d6d8cbf37f164 Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.027178 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbwlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-fbb6f4f4f-rbm9d_openstack-operators(4ce923f6-b8eb-4461-a222-0af773470e76): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.028549 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" podUID="4ce923f6-b8eb-4461-a222-0af773470e76" Mar 20 13:38:54 crc 
kubenswrapper[4895]: W0320 13:38:54.032176 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3b843f_4b33_455f_9d52_6a0267d370cb.slice/crio-57d2340c03af7e13c5ce6434132f613c9fd9efa83424ff8627c216d69e3bf5c3 WatchSource:0}: Error finding container 57d2340c03af7e13c5ce6434132f613c9fd9efa83424ff8627c216d69e3bf5c3: Status 404 returned error can't find the container with id 57d2340c03af7e13c5ce6434132f613c9fd9efa83424ff8627c216d69e3bf5c3 Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.038846 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j8n86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-8xlbt_openstack-operators(1d3b843f-4b33-455f-9d52-6a0267d370cb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.052809 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" podUID="1d3b843f-4b33-455f-9d52-6a0267d370cb" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.383663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.383902 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.384759 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:56.384735124 +0000 UTC m=+1035.894454100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.691149 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.691267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.691273 4895 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.691334 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:56.691319305 +0000 UTC m=+1036.201038271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.691435 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.691483 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:56.691468289 +0000 UTC m=+1036.201187255 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.925770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" event={"ID":"43459d05-1aac-46b1-b690-1b8c948bbb07","Type":"ContainerStarted","Data":"904cbd26ac57921a3e813a26c2b4a18a8279102a0f72f4b3d6c2caaed3b3b839"} Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.928749 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" podUID="43459d05-1aac-46b1-b690-1b8c948bbb07" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.932300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" event={"ID":"4ce923f6-b8eb-4461-a222-0af773470e76","Type":"ContainerStarted","Data":"9954a6fd29ba98ac7cdbf54761267394a03378dffb88b43f454d6d8cbf37f164"} Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.933697 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" podUID="4ce923f6-b8eb-4461-a222-0af773470e76" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.934401 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" event={"ID":"cc9f95f5-a6fd-4638-989e-3dff592f5022","Type":"ContainerStarted","Data":"a698618536c29ab5f481dbb62f7fe9883e6e0d8e8de571d43a54140cfd69a8a8"} Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.935326 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" podUID="cc9f95f5-a6fd-4638-989e-3dff592f5022" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.939193 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" event={"ID":"e9c5c274-21be-4e53-99f7-d1ab4f352142","Type":"ContainerStarted","Data":"a1dcc523bf17ef0b0311df9fffd1fed692f4871bc82ed6d4c1c31828540d0940"} Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.940472 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" podUID="e9c5c274-21be-4e53-99f7-d1ab4f352142" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.941372 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" event={"ID":"680cd993-89dd-47f4-8555-b49ff8293a76","Type":"ContainerStarted","Data":"eeb30e73cf33a7be85ee4f8000aff0c08f5e57b9ce3f0e5e3556ddfe8f1bd2fd"} Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.943853 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" event={"ID":"1d3b843f-4b33-455f-9d52-6a0267d370cb","Type":"ContainerStarted","Data":"57d2340c03af7e13c5ce6434132f613c9fd9efa83424ff8627c216d69e3bf5c3"} Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.945189 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" podUID="680cd993-89dd-47f4-8555-b49ff8293a76" Mar 20 13:38:54 crc kubenswrapper[4895]: E0320 13:38:54.945690 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" podUID="1d3b843f-4b33-455f-9d52-6a0267d370cb" Mar 20 13:38:54 crc kubenswrapper[4895]: I0320 13:38:54.946626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" event={"ID":"7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c","Type":"ContainerStarted","Data":"066e9b057d32c51a86415945ae4f00fc87b380b9f8a1ddea180057995d9e9efb"} Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 13:38:55.955561 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" 
podUID="4ce923f6-b8eb-4461-a222-0af773470e76" Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 13:38:55.955584 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" podUID="1d3b843f-4b33-455f-9d52-6a0267d370cb" Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 13:38:55.955632 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" podUID="cc9f95f5-a6fd-4638-989e-3dff592f5022" Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 13:38:55.958059 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" podUID="43459d05-1aac-46b1-b690-1b8c948bbb07" Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 13:38:55.958340 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" podUID="680cd993-89dd-47f4-8555-b49ff8293a76" Mar 20 13:38:55 crc kubenswrapper[4895]: E0320 
13:38:55.959050 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" podUID="e9c5c274-21be-4e53-99f7-d1ab4f352142" Mar 20 13:38:56 crc kubenswrapper[4895]: I0320 13:38:56.018308 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.018574 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.018672 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:00.018648699 +0000 UTC m=+1039.528367735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: I0320 13:38:56.425244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.425363 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.425426 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:00.425412039 +0000 UTC m=+1039.935131005 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: I0320 13:38:56.730115 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:56 crc kubenswrapper[4895]: I0320 13:38:56.730573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.730811 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.730867 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:00.730850601 +0000 UTC m=+1040.240569567 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.730927 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:38:56 crc kubenswrapper[4895]: E0320 13:38:56.731029 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:00.731002975 +0000 UTC m=+1040.240721971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: I0320 13:39:00.088837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.089015 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.089082 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert 
podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:08.089063071 +0000 UTC m=+1047.598782037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: I0320 13:39:00.494912 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.495240 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.495315 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:08.49529641 +0000 UTC m=+1048.005015376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: I0320 13:39:00.814366 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:39:00 crc kubenswrapper[4895]: I0320 13:39:00.814468 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.814618 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.814635 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.814665 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:08.814651184 +0000 UTC m=+1048.324370150 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:39:00 crc kubenswrapper[4895]: E0320 13:39:00.814730 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:08.814699826 +0000 UTC m=+1048.324418822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: I0320 13:39:08.117082 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.117294 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.117796 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert podName:e386d39f-7654-4d1d-84fc-6796309ac427 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:24.117774445 +0000 UTC m=+1063.627493411 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert") pod "infra-operator-controller-manager-7b9c774f96-dnmhw" (UID: "e386d39f-7654-4d1d-84fc-6796309ac427") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: I0320 13:39:08.523241 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.523443 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.523522 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert podName:de5694d4-a796-46ee-9f84-4b9d35475f27 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:24.523499852 +0000 UTC m=+1064.033218888 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" (UID: "de5694d4-a796-46ee-9f84-4b9d35475f27") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: I0320 13:39:08.827494 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.827710 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: I0320 13:39:08.827784 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.827811 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:24.827782454 +0000 UTC m=+1064.337501510 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "metrics-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.827982 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:39:08 crc kubenswrapper[4895]: E0320 13:39:08.828055 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs podName:27b2849d-9127-4c6b-a83f-a1ce0af6cac8 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:24.82803237 +0000 UTC m=+1064.337751366 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-c6nbz" (UID: "27b2849d-9127-4c6b-a83f-a1ce0af6cac8") : secret "webhook-server-cert" not found Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.095631 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" event={"ID":"810c2ef6-f5e6-4003-b01e-76e1edbbe452","Type":"ContainerStarted","Data":"7b8e2d2bc652954670d9b20d715afc5f34178dccfe15bf1015554f019f4d4540"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.096183 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.097996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" 
event={"ID":"f8f4d668-e8ad-4c6c-9107-6221569d3079","Type":"ContainerStarted","Data":"8cd4815a0b15db7800a36c132ba2d16c80ca11dcda32320982deef3a67a84621"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.098122 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.099629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" event={"ID":"e59747be-3214-43b9-b75b-88b8e7e71484","Type":"ContainerStarted","Data":"a8969b587068b635fd87b4abce0c7f855c8b5cb50a79dc13532117de6e52abf3"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.100170 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.103299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" event={"ID":"9d9feeae-ff51-432c-a4a4-e375d743f0b3","Type":"ContainerStarted","Data":"a03967783c505a5654c9f8d0dd0943a4728553560157a9aa57bb03efa4f65658"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.103537 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.117077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" event={"ID":"45950a96-a521-4429-b7d1-71efa644a087","Type":"ContainerStarted","Data":"b581e19599b8991ad04649f5c28ec65ae25cf787f5d3769137a11762780cd82c"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.117854 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.119075 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj" podStartSLOduration=3.783196323 podStartE2EDuration="19.119056363s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.093411535 +0000 UTC m=+1032.603130501" lastFinishedPulling="2026-03-20 13:39:08.429271575 +0000 UTC m=+1047.938990541" observedRunningTime="2026-03-20 13:39:10.11284102 +0000 UTC m=+1049.622559986" watchObservedRunningTime="2026-03-20 13:39:10.119056363 +0000 UTC m=+1049.628775329" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.124133 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" event={"ID":"de58ceb7-b3dd-487f-95eb-48d02a0accc3","Type":"ContainerStarted","Data":"35273bc3e6c730e4e9c0d7b0997c20534e699b179aef2a1df83480f51c19feb4"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.124249 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.128581 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" event={"ID":"7d7e8ef8-065c-40c0-a396-915b7efdd1a0","Type":"ContainerStarted","Data":"1682c9dea7e8dbcc1a4b328cc9e4358001702879ac8c8684ba0d3118aed70e64"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.128743 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.141491 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" event={"ID":"7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c","Type":"ContainerStarted","Data":"d63d52a3181d1211932cb3c5699cfe3ea1b24bedf57ec93f26d3390490a4a898"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.141746 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.150491 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m" podStartSLOduration=3.044232754 podStartE2EDuration="18.150476419s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.842112726 +0000 UTC m=+1033.351831692" lastFinishedPulling="2026-03-20 13:39:08.948356391 +0000 UTC m=+1048.458075357" observedRunningTime="2026-03-20 13:39:10.137704724 +0000 UTC m=+1049.647423690" watchObservedRunningTime="2026-03-20 13:39:10.150476419 +0000 UTC m=+1049.660195375" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.168589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" event={"ID":"73e2c644-bdbd-4769-946a-4e2111a28326","Type":"ContainerStarted","Data":"60b2cda65b81e4e888735c769e629cd1350dde0371370ddbb2d0d01056f44825"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.169221 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.173985 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk" podStartSLOduration=4.003106957 podStartE2EDuration="19.173972459s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" 
firstStartedPulling="2026-03-20 13:38:53.718105053 +0000 UTC m=+1033.227824019" lastFinishedPulling="2026-03-20 13:39:08.888970515 +0000 UTC m=+1048.398689521" observedRunningTime="2026-03-20 13:39:10.168620547 +0000 UTC m=+1049.678339513" watchObservedRunningTime="2026-03-20 13:39:10.173972459 +0000 UTC m=+1049.683691425" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.187205 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" event={"ID":"0039adb6-7c13-414b-bbd6-25e759da85b7","Type":"ContainerStarted","Data":"902f33ffb7ba928a4106e5f8e4d310599a53e9829b5b9b88670833ebcf6dfdd4"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.187600 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.190222 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" event={"ID":"8ec4bb36-473c-4103-bfeb-10e8df206b9a","Type":"ContainerStarted","Data":"899c61f0f3b9976c55fd47398af9d6f326feea1a70e3ca41f7b7119dbbbfaeeb"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.191053 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.197011 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d" podStartSLOduration=3.392741981 podStartE2EDuration="18.196991037s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.624279311 +0000 UTC m=+1033.133998277" lastFinishedPulling="2026-03-20 13:39:08.428528367 +0000 UTC m=+1047.938247333" observedRunningTime="2026-03-20 13:39:10.19144163 
+0000 UTC m=+1049.701160606" watchObservedRunningTime="2026-03-20 13:39:10.196991037 +0000 UTC m=+1049.706710003" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.208670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" event={"ID":"0d066703-200f-472a-b768-f6aef5eb347f","Type":"ContainerStarted","Data":"ec83aa4e0c31a32cb5e810b603155d59d5f78d7967f75c750b150a08f9ea8475"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.209109 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.211980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" event={"ID":"90753829-7cac-4f8f-8aa5-086430d0eafa","Type":"ContainerStarted","Data":"c5eb47649f9539833a8fc55e4d5758aaaa9bd49c1c91376528eb03b597b3c038"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.212331 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.227688 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx" podStartSLOduration=3.759987616 podStartE2EDuration="19.227672115s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.419956033 +0000 UTC m=+1032.929674999" lastFinishedPulling="2026-03-20 13:39:08.887640512 +0000 UTC m=+1048.397359498" observedRunningTime="2026-03-20 13:39:10.224092406 +0000 UTC m=+1049.733811372" watchObservedRunningTime="2026-03-20 13:39:10.227672115 +0000 UTC m=+1049.737391071" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.241953 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" event={"ID":"4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa","Type":"ContainerStarted","Data":"9525c4fb8d776429a236a105f29b2ab538a9daa47350cd69f44622204057e238"} Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.257412 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg" podStartSLOduration=3.170926525 podStartE2EDuration="18.257359888s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.86279469 +0000 UTC m=+1033.372513656" lastFinishedPulling="2026-03-20 13:39:08.949228053 +0000 UTC m=+1048.458947019" observedRunningTime="2026-03-20 13:39:10.250033177 +0000 UTC m=+1049.759752143" watchObservedRunningTime="2026-03-20 13:39:10.257359888 +0000 UTC m=+1049.767078854" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.302319 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz" podStartSLOduration=4.114917905 podStartE2EDuration="19.302299907s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.700288651 +0000 UTC m=+1033.210007617" lastFinishedPulling="2026-03-20 13:39:08.887670623 +0000 UTC m=+1048.397389619" observedRunningTime="2026-03-20 13:39:10.281830781 +0000 UTC m=+1049.791549747" watchObservedRunningTime="2026-03-20 13:39:10.302299907 +0000 UTC m=+1049.812018873" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.348315 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp" podStartSLOduration=3.45769417 podStartE2EDuration="19.348294483s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.055565264 +0000 UTC m=+1032.565284230" 
lastFinishedPulling="2026-03-20 13:39:08.946165577 +0000 UTC m=+1048.455884543" observedRunningTime="2026-03-20 13:39:10.343414122 +0000 UTC m=+1049.853133118" watchObservedRunningTime="2026-03-20 13:39:10.348294483 +0000 UTC m=+1049.858013449" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.348422 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d" podStartSLOduration=3.375891445 podStartE2EDuration="18.348416766s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.973132703 +0000 UTC m=+1033.482851669" lastFinishedPulling="2026-03-20 13:39:08.945658024 +0000 UTC m=+1048.455376990" observedRunningTime="2026-03-20 13:39:10.300849631 +0000 UTC m=+1049.810568597" watchObservedRunningTime="2026-03-20 13:39:10.348416766 +0000 UTC m=+1049.858135732" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.389783 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg" podStartSLOduration=3.120808246 podStartE2EDuration="18.389763477s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.675811772 +0000 UTC m=+1033.185530738" lastFinishedPulling="2026-03-20 13:39:08.944767003 +0000 UTC m=+1048.454485969" observedRunningTime="2026-03-20 13:39:10.387247905 +0000 UTC m=+1049.896966871" watchObservedRunningTime="2026-03-20 13:39:10.389763477 +0000 UTC m=+1049.899482433" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.413783 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx" podStartSLOduration=3.140029371 podStartE2EDuration="18.413763159s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.675507905 +0000 UTC m=+1033.185226871" 
lastFinishedPulling="2026-03-20 13:39:08.949241653 +0000 UTC m=+1048.458960659" observedRunningTime="2026-03-20 13:39:10.410805786 +0000 UTC m=+1049.920524752" watchObservedRunningTime="2026-03-20 13:39:10.413763159 +0000 UTC m=+1049.923482135" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.446326 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" podStartSLOduration=3.316011468 podStartE2EDuration="18.446307543s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.859928689 +0000 UTC m=+1033.369647655" lastFinishedPulling="2026-03-20 13:39:08.990224764 +0000 UTC m=+1048.499943730" observedRunningTime="2026-03-20 13:39:10.435875175 +0000 UTC m=+1049.945594141" watchObservedRunningTime="2026-03-20 13:39:10.446307543 +0000 UTC m=+1049.956026509" Mar 20 13:39:10 crc kubenswrapper[4895]: I0320 13:39:10.466197 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq" podStartSLOduration=3.4403244969999998 podStartE2EDuration="18.466179253s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.86277566 +0000 UTC m=+1033.372494616" lastFinishedPulling="2026-03-20 13:39:08.888630406 +0000 UTC m=+1048.398349372" observedRunningTime="2026-03-20 13:39:10.465715892 +0000 UTC m=+1049.975434868" watchObservedRunningTime="2026-03-20 13:39:10.466179253 +0000 UTC m=+1049.975898219" Mar 20 13:39:11 crc kubenswrapper[4895]: I0320 13:39:11.270438 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v" Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.314779 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" 
event={"ID":"e9c5c274-21be-4e53-99f7-d1ab4f352142","Type":"ContainerStarted","Data":"f7c121fb7a663a61f0f291080df14542512870d6b02282b844c3f11a9f6e87ce"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.315195 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.317083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" event={"ID":"680cd993-89dd-47f4-8555-b49ff8293a76","Type":"ContainerStarted","Data":"430dd226b5e1c7b26dacabe138dd94ffa953cc6db0315c00cc1467b4cd10ac0c"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.317280 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.318846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" event={"ID":"1d3b843f-4b33-455f-9d52-6a0267d370cb","Type":"ContainerStarted","Data":"8c9bf78f38d1f07b8781b5ff56144935421f79c1228f93f1dad2795bd38e2056"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.319003 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.320629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" event={"ID":"43459d05-1aac-46b1-b690-1b8c948bbb07","Type":"ContainerStarted","Data":"0fd0239a808ebc04fd16506b1040549517885ab3a3f2fb0858444e7dc2b18966"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.322563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" event={"ID":"4ce923f6-b8eb-4461-a222-0af773470e76","Type":"ContainerStarted","Data":"c0e616a009294295fdca191f7a116d30ec658a323d7d2122c940ba13cb701f70"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.322848 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.323799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" event={"ID":"cc9f95f5-a6fd-4638-989e-3dff592f5022","Type":"ContainerStarted","Data":"93bfccf831c8c6efc7dde37f3a707b6ddf5a624e12cabfb994cde3b9d1b5a208"}
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.323947 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.329368 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs" podStartSLOduration=11.576284218 podStartE2EDuration="26.329358232s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.675495254 +0000 UTC m=+1033.185214220" lastFinishedPulling="2026-03-20 13:39:08.428569268 +0000 UTC m=+1047.938288234" observedRunningTime="2026-03-20 13:39:10.504059028 +0000 UTC m=+1050.013777994" watchObservedRunningTime="2026-03-20 13:39:17.329358232 +0000 UTC m=+1056.839077198"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.332623 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6" podStartSLOduration=2.7754203349999997 podStartE2EDuration="25.332596702s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:54.009339092 +0000 UTC m=+1033.519058058" lastFinishedPulling="2026-03-20 13:39:16.566515459 +0000 UTC m=+1056.076234425" observedRunningTime="2026-03-20 13:39:17.32883264 +0000 UTC m=+1056.838551606" watchObservedRunningTime="2026-03-20 13:39:17.332596702 +0000 UTC m=+1056.842315668"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.357663 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c2cz2" podStartSLOduration=2.784818756 podStartE2EDuration="25.357647451s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.995635222 +0000 UTC m=+1033.505354188" lastFinishedPulling="2026-03-20 13:39:16.568463927 +0000 UTC m=+1056.078182883" observedRunningTime="2026-03-20 13:39:17.351821947 +0000 UTC m=+1056.861540913" watchObservedRunningTime="2026-03-20 13:39:17.357647451 +0000 UTC m=+1056.867366407"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.372507 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d" podStartSLOduration=2.748861734 podStartE2EDuration="25.372492148s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:54.027048003 +0000 UTC m=+1033.536766969" lastFinishedPulling="2026-03-20 13:39:16.650678417 +0000 UTC m=+1056.160397383" observedRunningTime="2026-03-20 13:39:17.369668728 +0000 UTC m=+1056.879387694" watchObservedRunningTime="2026-03-20 13:39:17.372492148 +0000 UTC m=+1056.882211114"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.401328 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m" podStartSLOduration=2.843454045 podStartE2EDuration="25.401310428s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:54.008618575 +0000 UTC m=+1033.518337541" lastFinishedPulling="2026-03-20 13:39:16.566474958 +0000 UTC m=+1056.076193924" observedRunningTime="2026-03-20 13:39:17.396425968 +0000 UTC m=+1056.906144924" watchObservedRunningTime="2026-03-20 13:39:17.401310428 +0000 UTC m=+1056.911029394"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.413216 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt" podStartSLOduration=2.885438357 podStartE2EDuration="25.413202783s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:54.038708052 +0000 UTC m=+1033.548427018" lastFinishedPulling="2026-03-20 13:39:16.566472478 +0000 UTC m=+1056.076191444" observedRunningTime="2026-03-20 13:39:17.411799488 +0000 UTC m=+1056.921518454" watchObservedRunningTime="2026-03-20 13:39:17.413202783 +0000 UTC m=+1056.922921749"
Mar 20 13:39:17 crc kubenswrapper[4895]: I0320 13:39:17.429944 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl" podStartSLOduration=2.858882435 podStartE2EDuration="25.429914365s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.99553928 +0000 UTC m=+1033.505258256" lastFinishedPulling="2026-03-20 13:39:16.56657122 +0000 UTC m=+1056.076290186" observedRunningTime="2026-03-20 13:39:17.42487581 +0000 UTC m=+1056.934594776" watchObservedRunningTime="2026-03-20 13:39:17.429914365 +0000 UTC m=+1056.939633331"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.200959 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h6qcp"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.213283 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-8tttj"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.286680 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6rrsx"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.314997 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-vvwnk"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.411829 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f4sjs"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.418994 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-x55wz"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.479188 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-nsh2d"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.586595 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-8nddx"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.639063 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqdrg"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.757757 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-pk85v"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.805591 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-c2kgq"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.830258 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xvkkg"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.856420 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7gt5d"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.936606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-br56m"
Mar 20 13:39:22 crc kubenswrapper[4895]: I0320 13:39:22.970627 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-f6m2m"
Mar 20 13:39:23 crc kubenswrapper[4895]: I0320 13:39:23.086342 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6xwg6"
Mar 20 13:39:23 crc kubenswrapper[4895]: I0320 13:39:23.299383 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-rbm9d"
Mar 20 13:39:23 crc kubenswrapper[4895]: I0320 13:39:23.305516 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-ppltl"
Mar 20 13:39:23 crc kubenswrapper[4895]: I0320 13:39:23.331160 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-8xlbt"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.171860 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.181030 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e386d39f-7654-4d1d-84fc-6796309ac427-cert\") pod \"infra-operator-controller-manager-7b9c774f96-dnmhw\" (UID: \"e386d39f-7654-4d1d-84fc-6796309ac427\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.239724 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.578119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.582079 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5694d4-a796-46ee-9f84-4b9d35475f27-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cm5d4\" (UID: \"de5694d4-a796-46ee-9f84-4b9d35475f27\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.753335 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"]
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.796922 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.881433 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.881517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.886253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:24 crc kubenswrapper[4895]: I0320 13:39:24.887438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27b2849d-9127-4c6b-a83f-a1ce0af6cac8-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-c6nbz\" (UID: \"27b2849d-9127-4c6b-a83f-a1ce0af6cac8\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:25 crc kubenswrapper[4895]: I0320 13:39:25.020022 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"]
Mar 20 13:39:25 crc kubenswrapper[4895]: W0320 13:39:25.025489 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5694d4_a796_46ee_9f84_4b9d35475f27.slice/crio-b4a7f5d51dbd65967c706e5f65e948f4b478282a635e948a2ccf89c8196b8835 WatchSource:0}: Error finding container b4a7f5d51dbd65967c706e5f65e948f4b478282a635e948a2ccf89c8196b8835: Status 404 returned error can't find the container with id b4a7f5d51dbd65967c706e5f65e948f4b478282a635e948a2ccf89c8196b8835
Mar 20 13:39:25 crc kubenswrapper[4895]: I0320 13:39:25.141288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:25 crc kubenswrapper[4895]: I0320 13:39:25.397858 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" event={"ID":"de5694d4-a796-46ee-9f84-4b9d35475f27","Type":"ContainerStarted","Data":"b4a7f5d51dbd65967c706e5f65e948f4b478282a635e948a2ccf89c8196b8835"}
Mar 20 13:39:25 crc kubenswrapper[4895]: I0320 13:39:25.399152 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" event={"ID":"e386d39f-7654-4d1d-84fc-6796309ac427","Type":"ContainerStarted","Data":"9c33a83318b968fa4e7749d8fd1da6691eeb970cc91d52dae709a3c8f6079753"}
Mar 20 13:39:25 crc kubenswrapper[4895]: I0320 13:39:25.665733 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"]
Mar 20 13:39:26 crc kubenswrapper[4895]: I0320 13:39:26.407568 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" event={"ID":"27b2849d-9127-4c6b-a83f-a1ce0af6cac8","Type":"ContainerStarted","Data":"af597a9602bda21eecc9a90c8d478040b0d6ee9f227f92869f90dd2f811d54f3"}
Mar 20 13:39:26 crc kubenswrapper[4895]: I0320 13:39:26.407921 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:26 crc kubenswrapper[4895]: I0320 13:39:26.407934 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" event={"ID":"27b2849d-9127-4c6b-a83f-a1ce0af6cac8","Type":"ContainerStarted","Data":"ceccd13839284252a025237f8ae3e330eca81b7985de5ccd157304284ba20bc6"}
Mar 20 13:39:26 crc kubenswrapper[4895]: I0320 13:39:26.438485 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz" podStartSLOduration=34.438462271 podStartE2EDuration="34.438462271s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:39:26.431711454 +0000 UTC m=+1065.941430440" watchObservedRunningTime="2026-03-20 13:39:26.438462271 +0000 UTC m=+1065.948181237"
Mar 20 13:39:27 crc kubenswrapper[4895]: I0320 13:39:27.418509 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" event={"ID":"de5694d4-a796-46ee-9f84-4b9d35475f27","Type":"ContainerStarted","Data":"6d5c6f3378008179288af5b9ed0d8e7cbc8c975216e1214b05a25fcbf407687b"}
Mar 20 13:39:27 crc kubenswrapper[4895]: I0320 13:39:27.418892 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"
Mar 20 13:39:28 crc kubenswrapper[4895]: I0320 13:39:28.439009 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" event={"ID":"e386d39f-7654-4d1d-84fc-6796309ac427","Type":"ContainerStarted","Data":"7b943987a5bef521dc5d8777fc6b83960c432a815f44d0123a8e61ca419b0149"}
Mar 20 13:39:28 crc kubenswrapper[4895]: I0320 13:39:28.478668 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4" podStartSLOduration=34.282786928 podStartE2EDuration="36.478634359s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:39:25.028255756 +0000 UTC m=+1064.537974712" lastFinishedPulling="2026-03-20 13:39:27.224103177 +0000 UTC m=+1066.733822143" observedRunningTime="2026-03-20 13:39:27.455148181 +0000 UTC m=+1066.964867157" watchObservedRunningTime="2026-03-20 13:39:28.478634359 +0000 UTC m=+1067.988353355"
Mar 20 13:39:28 crc kubenswrapper[4895]: I0320 13:39:28.480207 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw" podStartSLOduration=35.025840804 podStartE2EDuration="37.480192828s" podCreationTimestamp="2026-03-20 13:38:51 +0000 UTC" firstStartedPulling="2026-03-20 13:39:24.7647849 +0000 UTC m=+1064.274503876" lastFinishedPulling="2026-03-20 13:39:27.219136934 +0000 UTC m=+1066.728855900" observedRunningTime="2026-03-20 13:39:28.460687256 +0000 UTC m=+1067.970406252" watchObservedRunningTime="2026-03-20 13:39:28.480192828 +0000 UTC m=+1067.989911834"
Mar 20 13:39:29 crc kubenswrapper[4895]: I0320 13:39:29.450797 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"
Mar 20 13:39:34 crc kubenswrapper[4895]: I0320 13:39:34.249074 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-dnmhw"
Mar 20 13:39:34 crc kubenswrapper[4895]: I0320 13:39:34.807980 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cm5d4"
Mar 20 13:39:35 crc kubenswrapper[4895]: I0320 13:39:35.153875 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-c6nbz"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.815597 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"]
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.818063 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.820240 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.820521 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.821195 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.821232 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4xlkm"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.839027 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"]
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.869797 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"]
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.873675 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.876038 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.886939 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"]
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.902500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldwb6\" (UniqueName: \"kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:50 crc kubenswrapper[4895]: I0320 13:39:50.902561 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.003365 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.003539 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.003632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldwb6\" (UniqueName: \"kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.003662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g277z\" (UniqueName: \"kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.003690 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.004353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.027456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldwb6\" (UniqueName: \"kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6\") pod \"dnsmasq-dns-675f4bcbfc-g2x4n\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.105076 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.105450 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g277z\" (UniqueName: \"kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.105557 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.105919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.106262 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.120496 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g277z\" (UniqueName: \"kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z\") pod \"dnsmasq-dns-78dd6ddcc-52rlf\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.137416 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.187259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf"
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.590442 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"]
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.639015 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"]
Mar 20 13:39:51 crc kubenswrapper[4895]: I0320 13:39:51.640134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n" event={"ID":"5e11919e-55bb-43f6-8613-abb6d70088f4","Type":"ContainerStarted","Data":"83d6fd0a0107a81543055125d5736fad16cc72a718db362b0edca0467698799a"}
Mar 20 13:39:51 crc kubenswrapper[4895]: W0320 13:39:51.642261 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6b9801_a5b2_4f9c_99a6_aec4c81bd381.slice/crio-02a27afbe1d0a6823803b20308e5d44057c51808ea7cca781d2963604b7ab8d6 WatchSource:0}: Error finding container 02a27afbe1d0a6823803b20308e5d44057c51808ea7cca781d2963604b7ab8d6: Status 404 returned error can't find the container with id 02a27afbe1d0a6823803b20308e5d44057c51808ea7cca781d2963604b7ab8d6
Mar 20 13:39:52 crc kubenswrapper[4895]: I0320 13:39:52.651204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf" event={"ID":"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381","Type":"ContainerStarted","Data":"02a27afbe1d0a6823803b20308e5d44057c51808ea7cca781d2963604b7ab8d6"}
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.677562 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"]
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.695996 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"]
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.698302 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.724802 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"]
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.847307 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.847352 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.847440 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94kp\" (UniqueName: \"kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.946979 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"]
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.948737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.948776 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.948815 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94kp\" (UniqueName: \"kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.950232 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.950897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.973655 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"]
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.974796 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:53 crc kubenswrapper[4895]: I0320 13:39:53.986191 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"]
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.002407 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94kp\" (UniqueName: \"kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp\") pod \"dnsmasq-dns-5ccc8479f9-gz6jx\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.028910 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.050642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjgm\" (UniqueName: \"kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.050742 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.050773 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.152892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.152991 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.153069 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjgm\" (UniqueName: \"kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.153791 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.154363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.174182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjgm\" (UniqueName: \"kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm\") pod \"dnsmasq-dns-57d769cc4f-zhqtz\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz"
Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.326462 4895 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.837598 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.839197 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.845277 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.870792 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.870895 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.870816 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.871332 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9hmzl" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.871516 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.871647 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.871719 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.967822 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.967868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.967902 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.967938 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.967976 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968003 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968017 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968039 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88hx\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:54 crc kubenswrapper[4895]: I0320 13:39:54.968108 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069507 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069651 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069668 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069706 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88hx\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069761 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069778 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.069799 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.071474 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.072427 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.072963 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.073640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.076183 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.076238 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1ed9a52ad5555824dc3a173b342b86aa900d19d5be3872519e1faa195be79d6d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.083984 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.085697 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.085823 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 
13:39:55.086017 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.087732 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.098163 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88hx\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.147652 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.151258 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.154832 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.154932 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.155139 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7plls" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.155442 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.155730 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.155819 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.155832 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.162563 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.191563 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.194286 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272678 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272705 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272745 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272804 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272881 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272909 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272926 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272966 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.272984 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622j2\" (UniqueName: 
\"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.273008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.374564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.374637 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.374704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.374898 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.374979 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.375058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.375102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.375136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.375195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc 
kubenswrapper[4895]: I0320 13:39:55.375233 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-622j2\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.375284 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.376293 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.376821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.379924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.379975 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.381456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.382679 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.382724 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf58efcf1f086a4b5a46ed60249900a178edf090a90489330013a00504335efb/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.388113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.389083 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.389485 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.395878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.401363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622j2\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.428992 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " pod="openstack/rabbitmq-server-0" Mar 20 13:39:55 crc kubenswrapper[4895]: I0320 13:39:55.496858 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.277240 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.278789 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.285196 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.285291 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-59zmn" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.285654 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.285759 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.287965 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.289970 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388349 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388497 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388609 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqd6\" (UniqueName: \"kubernetes.io/projected/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kube-api-access-swqd6\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.388858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489765 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489826 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqd6\" (UniqueName: \"kubernetes.io/projected/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kube-api-access-swqd6\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.489858 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.491509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kolla-config\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.491584 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.491671 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.491692 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b3c4f62-dc8a-49bd-b97e-d57133678e19-config-data-default\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.494824 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.494869 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc99c3feb620c17a7e13284f8b90eccd39aa50365fe00134a0f0206fcb75e14f/globalmount\"" pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.495534 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.495706 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3c4f62-dc8a-49bd-b97e-d57133678e19-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.520122 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqd6\" (UniqueName: \"kubernetes.io/projected/6b3c4f62-dc8a-49bd-b97e-d57133678e19-kube-api-access-swqd6\") pod \"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.520911 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49449a38-ef33-40dd-bb05-59ebf253ea20\") pod 
\"openstack-galera-0\" (UID: \"6b3c4f62-dc8a-49bd-b97e-d57133678e19\") " pod="openstack/openstack-galera-0" Mar 20 13:39:56 crc kubenswrapper[4895]: I0320 13:39:56.604070 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.657179 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.659996 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.664284 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.664993 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.665758 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rw58z" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.666122 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.673012 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809158 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809200 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809540 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.809579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf6n6\" (UniqueName: \"kubernetes.io/projected/c851b618-6bf5-4291-ae40-20ed962dfe46-kube-api-access-jf6n6\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.910896 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.910965 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911013 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: 
I0320 13:39:57.911052 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911129 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911190 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf6n6\" (UniqueName: \"kubernetes.io/projected/c851b618-6bf5-4291-ae40-20ed962dfe46-kube-api-access-jf6n6\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911293 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.911656 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.912589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.914917 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.919415 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.920018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c851b618-6bf5-4291-ae40-20ed962dfe46-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.920626 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.920815 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8821957fd6657f63193d73200a0c322aed0df0669c6cd6221e8914d5d0e322f1/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.929762 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c851b618-6bf5-4291-ae40-20ed962dfe46-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.941815 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf6n6\" (UniqueName: \"kubernetes.io/projected/c851b618-6bf5-4291-ae40-20ed962dfe46-kube-api-access-jf6n6\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.964918 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b93c5a7-28b4-4304-a96f-3e38b665d314\") pod \"openstack-cell1-galera-0\" (UID: \"c851b618-6bf5-4291-ae40-20ed962dfe46\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:57 crc kubenswrapper[4895]: I0320 13:39:57.997293 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.081959 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.083370 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.085818 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2sgdr" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.086028 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.090572 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.096727 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.114268 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kolla-config\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.114314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-config-data\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.114353 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.114382 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62dm\" (UniqueName: \"kubernetes.io/projected/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kube-api-access-j62dm\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.114419 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.217369 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kolla-config\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.218223 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-config-data\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.218291 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " 
pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.218329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j62dm\" (UniqueName: \"kubernetes.io/projected/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kube-api-access-j62dm\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.218372 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.218061 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kolla-config\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.219663 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c64e6c1-1601-4c6d-9cfe-2287e9147576-config-data\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.221103 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.222862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c64e6c1-1601-4c6d-9cfe-2287e9147576-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.232344 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62dm\" (UniqueName: \"kubernetes.io/projected/7c64e6c1-1601-4c6d-9cfe-2287e9147576-kube-api-access-j62dm\") pod \"memcached-0\" (UID: \"7c64e6c1-1601-4c6d-9cfe-2287e9147576\") " pod="openstack/memcached-0" Mar 20 13:39:58 crc kubenswrapper[4895]: I0320 13:39:58.402925 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.081621 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.082773 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.084830 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4w9gv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.098033 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.149436 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-4rshv"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.151523 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.155779 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.156635 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.156805 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.162013 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-4rshv"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.259903 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrj4l\" (UniqueName: \"kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l\") pod \"auto-csr-approver-29566900-4rshv\" (UID: \"f341d72e-a04d-4f58-a7f9-bed0b19710ae\") " pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.260283 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhb6v\" (UniqueName: \"kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v\") pod \"kube-state-metrics-0\" (UID: \"9f27bbad-8a84-4902-8349-8c8724552442\") " pod="openstack/kube-state-metrics-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.361983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrj4l\" (UniqueName: \"kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l\") pod \"auto-csr-approver-29566900-4rshv\" (UID: \"f341d72e-a04d-4f58-a7f9-bed0b19710ae\") " 
pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.362056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhb6v\" (UniqueName: \"kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v\") pod \"kube-state-metrics-0\" (UID: \"9f27bbad-8a84-4902-8349-8c8724552442\") " pod="openstack/kube-state-metrics-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.383710 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhb6v\" (UniqueName: \"kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v\") pod \"kube-state-metrics-0\" (UID: \"9f27bbad-8a84-4902-8349-8c8724552442\") " pod="openstack/kube-state-metrics-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.384032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrj4l\" (UniqueName: \"kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l\") pod \"auto-csr-approver-29566900-4rshv\" (UID: \"f341d72e-a04d-4f58-a7f9-bed0b19710ae\") " pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.410980 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.493907 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.691096 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.692918 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.695224 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.695292 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.695438 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.695463 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-4557f" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.695675 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.719403 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775206 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 
crc kubenswrapper[4895]: I0320 13:40:00.775292 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775510 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775568 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.775646 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrs8\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-kube-api-access-mzrs8\") pod \"alertmanager-metric-storage-0\" (UID: 
\"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.876617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.876708 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.876738 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.877446 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.877887 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.878012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.878049 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.878175 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrs8\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-kube-api-access-mzrs8\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.880981 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.881441 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.882545 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.884360 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d413e49a-6f03-44fc-87bf-f6b71efac9ad-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.884961 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:00 crc kubenswrapper[4895]: I0320 13:40:00.895282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrs8\" (UniqueName: \"kubernetes.io/projected/d413e49a-6f03-44fc-87bf-f6b71efac9ad-kube-api-access-mzrs8\") pod \"alertmanager-metric-storage-0\" (UID: \"d413e49a-6f03-44fc-87bf-f6b71efac9ad\") " pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.013238 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.695854 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.699008 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.702385 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.702907 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-phbvs" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.703060 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.703738 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.704013 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.704061 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.708385 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.713992 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.716873 4895 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892335 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp8qq\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892628 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892715 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892792 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892813 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892957 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.892978 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.994567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.994617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.994636 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.994691 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " 
pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995232 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995252 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995598 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995778 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995811 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config\") 
pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp8qq\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995864 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.995923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.996458 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.998024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:01 crc kubenswrapper[4895]: I0320 13:40:01.998468 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.000422 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.000460 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ed9a86cf4c4d51ee7a5816741a7d45729f9ae3892a1ed9810e6048d991f055dd/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.007084 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.009386 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.010144 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.012856 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp8qq\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.054100 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:02 crc kubenswrapper[4895]: I0320 13:40:02.328146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.855203 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4phvm"] Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.856303 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4phvm" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.858800 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-28fhv" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.858804 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.869458 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4phvm"] Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.872749 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.899420 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mvskb"] Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.901439 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:03 crc kubenswrapper[4895]: I0320 13:40:03.922850 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mvskb"] Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.023707 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.025304 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.029731 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.029961 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.030240 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-txmtg" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.030414 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.030550 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.033932 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-lib\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.033996 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-scripts\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034044 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whx57\" (UniqueName: \"kubernetes.io/projected/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-kube-api-access-whx57\") pod 
\"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034107 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-log\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034252 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-run\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034299 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjvvr\" (UniqueName: \"kubernetes.io/projected/f0db633f-39ca-4915-ab69-a17d9140e31b-kube-api-access-hjvvr\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-combined-ca-bundle\") pod \"ovn-controller-4phvm\" (UID: 
\"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034437 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034461 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-etc-ovs\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034523 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0db633f-39ca-4915-ab69-a17d9140e31b-scripts\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-ovn-controller-tls-certs\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.034600 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-log-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " 
pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.036234 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.137010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-combined-ca-bundle\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.137059 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.137550 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-etc-ovs\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142033 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.137078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-etc-ovs\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " 
pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142198 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142298 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0db633f-39ca-4915-ab69-a17d9140e31b-scripts\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142340 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-ovn-controller-tls-certs\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142365 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-log-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 
13:40:04.142454 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142483 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-lib\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142507 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmwb\" (UniqueName: \"kubernetes.io/projected/af691a5e-1267-46ec-9d39-f4fa047a1741-kube-api-access-8kmwb\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-scripts\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142591 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whx57\" (UniqueName: \"kubernetes.io/projected/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-kube-api-access-whx57\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142638 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-log\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142780 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142813 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142845 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142871 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142898 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-run\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.142957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjvvr\" (UniqueName: \"kubernetes.io/projected/f0db633f-39ca-4915-ab69-a17d9140e31b-kube-api-access-hjvvr\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.143016 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-config\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.143438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-log\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.143517 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-run\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.143531 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-run\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " 
pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.143655 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0db633f-39ca-4915-ab69-a17d9140e31b-var-log-ovn\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.144935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-var-lib\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.146165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0db633f-39ca-4915-ab69-a17d9140e31b-scripts\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.146532 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-scripts\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.147969 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-combined-ca-bundle\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.153788 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f0db633f-39ca-4915-ab69-a17d9140e31b-ovn-controller-tls-certs\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.160243 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjvvr\" (UniqueName: \"kubernetes.io/projected/f0db633f-39ca-4915-ab69-a17d9140e31b-kube-api-access-hjvvr\") pod \"ovn-controller-4phvm\" (UID: \"f0db633f-39ca-4915-ab69-a17d9140e31b\") " pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.164502 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whx57\" (UniqueName: \"kubernetes.io/projected/ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9-kube-api-access-whx57\") pod \"ovn-controller-ovs-mvskb\" (UID: \"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9\") " pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.174199 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4phvm" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.216313 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244256 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244359 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244378 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmwb\" (UniqueName: \"kubernetes.io/projected/af691a5e-1267-46ec-9d39-f4fa047a1741-kube-api-access-8kmwb\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244480 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 
13:40:04.244505 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.244573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-config\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.245292 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-config\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.246222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af691a5e-1267-46ec-9d39-f4fa047a1741-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.247031 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" 
(UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.251242 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.257911 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.257958 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5f348b14d77d6d5e616b401269562d3ce28fcdf4902c6509ce46d5c5cb5783d/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.263555 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmwb\" (UniqueName: \"kubernetes.io/projected/af691a5e-1267-46ec-9d39-f4fa047a1741-kube-api-access-8kmwb\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.263926 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc 
kubenswrapper[4895]: I0320 13:40:04.268049 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af691a5e-1267-46ec-9d39-f4fa047a1741-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.297214 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f6c0576-e315-460f-b532-141ca0ef8f3e\") pod \"ovsdbserver-nb-0\" (UID: \"af691a5e-1267-46ec-9d39-f4fa047a1741\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:04 crc kubenswrapper[4895]: I0320 13:40:04.347770 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.183024 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.184569 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.186796 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.186988 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.186998 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.203350 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-mcvqq" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.204751 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.206075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwpc\" (UniqueName: \"kubernetes.io/projected/b4cd9c2d-3b16-4152-9269-263b91fa4769-kube-api-access-lzwpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.206124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.206171 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.206201 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.206220 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.224419 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.307886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwpc\" (UniqueName: \"kubernetes.io/projected/b4cd9c2d-3b16-4152-9269-263b91fa4769-kube-api-access-lzwpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " 
pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.308479 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.308580 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.308632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.308664 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.310008 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.314600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.315776 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.334195 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b4cd9c2d-3b16-4152-9269-263b91fa4769-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.335607 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwpc\" (UniqueName: \"kubernetes.io/projected/b4cd9c2d-3b16-4152-9269-263b91fa4769-kube-api-access-lzwpc\") pod 
\"cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk\" (UID: \"b4cd9c2d-3b16-4152-9269-263b91fa4769\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.380665 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.385463 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.388493 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.388645 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.389346 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.417428 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.489588 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.490781 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.493857 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.494055 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514160 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514227 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514308 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514352 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz85x\" (UniqueName: \"kubernetes.io/projected/384ff1a6-c0b2-4b58-aac3-e847f789de25-kube-api-access-zz85x\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514382 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.514419 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.520795 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.528788 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615380 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615715 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pf5m\" (UniqueName: \"kubernetes.io/projected/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-kube-api-access-9pf5m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615796 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " 
pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615943 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.615981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz85x\" (UniqueName: \"kubernetes.io/projected/384ff1a6-c0b2-4b58-aac3-e847f789de25-kube-api-access-zz85x\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.616008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.616031 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.616056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.616097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.617372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.617726 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/384ff1a6-c0b2-4b58-aac3-e847f789de25-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.622575 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.632018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.632573 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.632826 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.633317 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/384ff1a6-c0b2-4b58-aac3-e847f789de25-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.633538 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.636051 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.636516 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.636780 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.637698 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.637996 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.638377 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.638728 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-fktzl" Mar 
20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.660838 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.662301 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz85x\" (UniqueName: \"kubernetes.io/projected/384ff1a6-c0b2-4b58-aac3-e847f789de25-kube-api-access-zz85x\") pod \"cloudkitty-lokistack-querier-668f98fdd7-ltb4d\" (UID: \"384ff1a6-c0b2-4b58-aac3-e847f789de25\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.675436 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg"] Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.708953 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.718104 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pf5m\" (UniqueName: \"kubernetes.io/projected/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-kube-api-access-9pf5m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.718260 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.718330 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.718361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.718440 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.721859 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.722859 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.725920 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.726009 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.745380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pf5m\" (UniqueName: \"kubernetes.io/projected/97b1a9d8-e379-4fe0-9036-3c05e9620b4a-kube-api-access-9pf5m\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-h8n6z\" (UID: \"97b1a9d8-e379-4fe0-9036-3c05e9620b4a\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.815619 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822720 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822769 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822801 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nhzq\" (UniqueName: \"kubernetes.io/projected/faa3805b-edc0-4e1a-91e5-05667f94e119-kube-api-access-7nhzq\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822861 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822883 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822901 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822947 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: 
\"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.822968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823026 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvpc\" (UniqueName: \"kubernetes.io/projected/9eda3cc0-3576-46cb-8da1-12ca651af767-kube-api-access-qrvpc\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823048 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823069 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823089 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.823152 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924407 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924499 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nhzq\" (UniqueName: \"kubernetes.io/projected/faa3805b-edc0-4e1a-91e5-05667f94e119-kube-api-access-7nhzq\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: 
\"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924540 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924566 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924585 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924636 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: E0320 13:40:07.924692 4895 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924698 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: E0320 13:40:07.924815 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret podName:9eda3cc0-3576-46cb-8da1-12ca651af767 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:08.424771454 +0000 UTC m=+1107.934490430 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret") pod "cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" (UID: "9eda3cc0-3576-46cb-8da1-12ca651af767") : secret "cloudkitty-lokistack-gateway-http" not found Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvpc\" (UniqueName: \"kubernetes.io/projected/9eda3cc0-3576-46cb-8da1-12ca651af767-kube-api-access-qrvpc\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924947 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.924988 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925135 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925581 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925651 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.925793 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.926000 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: E0320 13:40:07.926073 4895 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Mar 20 13:40:07 crc kubenswrapper[4895]: E0320 13:40:07.926118 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret podName:faa3805b-edc0-4e1a-91e5-05667f94e119 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:08.426103057 +0000 UTC m=+1107.935822023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret") pod "cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" (UID: "faa3805b-edc0-4e1a-91e5-05667f94e119") : secret "cloudkitty-lokistack-gateway-http" not found Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.926366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/faa3805b-edc0-4e1a-91e5-05667f94e119-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.926862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.937146 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.937270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.937452 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.940862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.943039 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.943444 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.945977 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.946287 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9eda3cc0-3576-46cb-8da1-12ca651af767-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.947015 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nhzq\" (UniqueName: \"kubernetes.io/projected/faa3805b-edc0-4e1a-91e5-05667f94e119-kube-api-access-7nhzq\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:07 crc kubenswrapper[4895]: I0320 13:40:07.949057 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvpc\" (UniqueName: \"kubernetes.io/projected/9eda3cc0-3576-46cb-8da1-12ca651af767-kube-api-access-qrvpc\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.410328 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.411780 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.415757 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.416249 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.424263 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.434476 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.434807 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.451983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9eda3cc0-3576-46cb-8da1-12ca651af767-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-82dcg\" (UID: \"9eda3cc0-3576-46cb-8da1-12ca651af767\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.452667 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: 
\"kubernetes.io/secret/faa3805b-edc0-4e1a-91e5-05667f94e119-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-n2t45\" (UID: \"faa3805b-edc0-4e1a-91e5-05667f94e119\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.489273 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.490799 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.493576 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.493855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.504690 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537511 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537588 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537640 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537728 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.537911 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrpdn\" (UniqueName: \"kubernetes.io/projected/27c73d65-3dcb-44cb-a61e-004919dda8b4-kube-api-access-qrpdn\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.538078 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.538175 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.551402 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.553108 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.555047 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.557467 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.560053 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.597932 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.605180 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.639603 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.639655 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.639734 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.639836 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.640786 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpdn\" (UniqueName: 
\"kubernetes.io/projected/27c73d65-3dcb-44cb-a61e-004919dda8b4-kube-api-access-qrpdn\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.640848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.640915 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.640992 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641214 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc 
kubenswrapper[4895]: I0320 13:40:08.641269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641337 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4nc\" (UniqueName: \"kubernetes.io/projected/2c72a116-103e-4be6-91c2-65168b4d456e-kube-api-access-zl4nc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-grpc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641451 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641481 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641718 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641765 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641790 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641769 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pqp\" (UniqueName: \"kubernetes.io/projected/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-kube-api-access-h4pqp\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641821 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641836 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.641866 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.642496 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.642644 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73d65-3dcb-44cb-a61e-004919dda8b4-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.642871 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.648235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.648749 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/27c73d65-3dcb-44cb-a61e-004919dda8b4-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.654695 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrpdn\" (UniqueName: \"kubernetes.io/projected/27c73d65-3dcb-44cb-a61e-004919dda8b4-kube-api-access-qrpdn\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.667084 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.704265 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"27c73d65-3dcb-44cb-a61e-004919dda8b4\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743683 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743731 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743757 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pqp\" (UniqueName: \"kubernetes.io/projected/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-kube-api-access-h4pqp\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743778 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743815 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.743961 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.744000 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.744024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.744050 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 
13:40:08.744049 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.744070 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4nc\" (UniqueName: \"kubernetes.io/projected/2c72a116-103e-4be6-91c2-65168b4d456e-kube-api-access-zl4nc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.745142 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.745474 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.745497 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.745513 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.745666 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.746311 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.746971 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.748038 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" 
(UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.749348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.752080 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.753291 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.756225 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c72a116-103e-4be6-91c2-65168b4d456e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.757343 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" 
(UniqueName: \"kubernetes.io/secret/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.768932 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4nc\" (UniqueName: \"kubernetes.io/projected/2c72a116-103e-4be6-91c2-65168b4d456e-kube-api-access-zl4nc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.772266 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pqp\" (UniqueName: \"kubernetes.io/projected/29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e-kube-api-access-h4pqp\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.773913 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.774230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2c72a116-103e-4be6-91c2-65168b4d456e\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.807002 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.818713 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:08 crc kubenswrapper[4895]: I0320 13:40:08.875106 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:08 crc kubenswrapper[4895]: E0320 13:40:08.991550 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:40:08 crc kubenswrapper[4895]: E0320 13:40:08.991732 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldwb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-g2x4n_openstack(5e11919e-55bb-43f6-8613-abb6d70088f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:40:08 crc kubenswrapper[4895]: E0320 13:40:08.993118 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n" podUID="5e11919e-55bb-43f6-8613-abb6d70088f4" Mar 20 13:40:09 crc kubenswrapper[4895]: E0320 13:40:09.106436 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:40:09 crc kubenswrapper[4895]: E0320 13:40:09.106898 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g277z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-52rlf_openstack(dd6b9801-a5b2-4f9c-99a6-aec4c81bd381): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:40:09 crc kubenswrapper[4895]: E0320 13:40:09.108525 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf" podUID="dd6b9801-a5b2-4f9c-99a6-aec4c81bd381" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.118502 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.120167 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.122541 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ssf2t" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.122794 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.122813 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.123004 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.134153 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.258841 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.258893 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.258942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.259017 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.259039 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsst8\" (UniqueName: \"kubernetes.io/projected/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-kube-api-access-vsst8\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.259058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.259083 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.259110 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0be91e14-3ddd-4a62-814b-c6481403a621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be91e14-3ddd-4a62-814b-c6481403a621\") pod \"ovsdbserver-sb-0\" (UID: 
\"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.298295 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"] Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360430 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360482 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360589 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsst8\" (UniqueName: \"kubernetes.io/projected/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-kube-api-access-vsst8\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360609 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360660 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0be91e14-3ddd-4a62-814b-c6481403a621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be91e14-3ddd-4a62-814b-c6481403a621\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.360701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.362153 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.364641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.364947 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-config\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.366871 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.366877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.367379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.368277 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.368304 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0be91e14-3ddd-4a62-814b-c6481403a621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be91e14-3ddd-4a62-814b-c6481403a621\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa8242c49b07f04dc141dfca6bff7e1ecffad9cf67c908d80d406c78c8ef725a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.401879 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsst8\" (UniqueName: \"kubernetes.io/projected/5761f186-a7a3-4ce2-8ed9-bcea12b186c8-kube-api-access-vsst8\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.415498 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0be91e14-3ddd-4a62-814b-c6481403a621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be91e14-3ddd-4a62-814b-c6481403a621\") pod \"ovsdbserver-sb-0\" (UID: \"5761f186-a7a3-4ce2-8ed9-bcea12b186c8\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.439983 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:09 crc kubenswrapper[4895]: I0320 13:40:09.813138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" event={"ID":"775fc130-07f0-45cc-88b8-357b47d31d40","Type":"ContainerStarted","Data":"56c889fd550c61e282500a1775fb312f7d277c45eb88a535fdc548e807460c23"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.033893 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.249703 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"] Mar 20 13:40:10 crc kubenswrapper[4895]: W0320 13:40:10.339935 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5914e04_ecb0_4e00_8b39_8fbc9abf6afb.slice/crio-9e59724f11df6c39e6b0fde7233bbf7a058f67fc67c7366508864c8f64892bd5 WatchSource:0}: Error finding container 9e59724f11df6c39e6b0fde7233bbf7a058f67fc67c7366508864c8f64892bd5: Status 404 returned error can't find the container with id 9e59724f11df6c39e6b0fde7233bbf7a058f67fc67c7366508864c8f64892bd5 Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.585215 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.590105 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.712602 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g277z\" (UniqueName: \"kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z\") pod \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713012 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc\") pod \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713057 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config\") pod \"5e11919e-55bb-43f6-8613-abb6d70088f4\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713088 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldwb6\" (UniqueName: \"kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6\") pod \"5e11919e-55bb-43f6-8613-abb6d70088f4\" (UID: \"5e11919e-55bb-43f6-8613-abb6d70088f4\") " Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713255 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config\") pod \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\" (UID: \"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381\") " Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713804 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config" (OuterVolumeSpecName: "config") pod "5e11919e-55bb-43f6-8613-abb6d70088f4" (UID: "5e11919e-55bb-43f6-8613-abb6d70088f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713834 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381" (UID: "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.713914 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config" (OuterVolumeSpecName: "config") pod "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381" (UID: "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.714325 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.714354 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e11919e-55bb-43f6-8613-abb6d70088f4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.714366 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.726115 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4phvm"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.738629 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6" (OuterVolumeSpecName: "kube-api-access-ldwb6") pod "5e11919e-55bb-43f6-8613-abb6d70088f4" (UID: "5e11919e-55bb-43f6-8613-abb6d70088f4"). InnerVolumeSpecName "kube-api-access-ldwb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.747751 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-4rshv"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.756231 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z" (OuterVolumeSpecName: "kube-api-access-g277z") pod "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381" (UID: "dd6b9801-a5b2-4f9c-99a6-aec4c81bd381"). 
InnerVolumeSpecName "kube-api-access-g277z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.793606 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.822881 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g277z\" (UniqueName: \"kubernetes.io/projected/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381-kube-api-access-g277z\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.822917 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldwb6\" (UniqueName: \"kubernetes.io/projected/5e11919e-55bb-43f6-8613-abb6d70088f4-kube-api-access-ldwb6\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.832881 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.844349 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b3c4f62-dc8a-49bd-b97e-d57133678e19","Type":"ContainerStarted","Data":"d8654f19fc714bf656c001cd67c7b0b132d2ed4df0ef41e2366857d410fc0680"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.849674 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.849900 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g2x4n" event={"ID":"5e11919e-55bb-43f6-8613-abb6d70088f4","Type":"ContainerDied","Data":"83d6fd0a0107a81543055125d5736fad16cc72a718db362b0edca0467698799a"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.857218 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.866548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf" event={"ID":"dd6b9801-a5b2-4f9c-99a6-aec4c81bd381","Type":"ContainerDied","Data":"02a27afbe1d0a6823803b20308e5d44057c51808ea7cca781d2963604b7ab8d6"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.866636 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-52rlf" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.868246 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-4rshv" event={"ID":"f341d72e-a04d-4f58-a7f9-bed0b19710ae","Type":"ContainerStarted","Data":"cd3e260108bcacb9a5a68d4610fbebf2c681e32d74816a9aea632038b23e12c2"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.870976 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.871013 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" event={"ID":"faa3805b-edc0-4e1a-91e5-05667f94e119","Type":"ContainerStarted","Data":"5bfcb29722e6a196229662a45d0b6b7f76cdb0bfa9cb9537a9f9ff51b5fcac38"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.878161 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.884712 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.894887 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: E0320 13:40:10.903732 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-a
ccess-lzwpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk_openstack(b4cd9c2d-3b16-4152-9269-263b91fa4769): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.903923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d413e49a-6f03-44fc-87bf-f6b71efac9ad","Type":"ContainerStarted","Data":"cafa7e659184a98bc43917f26f8e66ffa8d50cc29fbf5cee4a36288385920016"} Mar 20 13:40:10 crc kubenswrapper[4895]: E0320 13:40:10.905105 4895 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" podUID="b4cd9c2d-3b16-4152-9269-263b91fa4769" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.905163 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.913995 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.914877 4895 generic.go:334] "Generic (PLEG): container finished" podID="775fc130-07f0-45cc-88b8-357b47d31d40" containerID="0c4fa97276a3f95dbaa51409217ec0946b559dc47ad924a834afbb5c6fdf4646" exitCode=0 Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.914928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" event={"ID":"775fc130-07f0-45cc-88b8-357b47d31d40","Type":"ContainerDied","Data":"0c4fa97276a3f95dbaa51409217ec0946b559dc47ad924a834afbb5c6fdf4646"} Mar 20 13:40:10 crc kubenswrapper[4895]: E0320 13:40:10.915595 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n57h674h58fh74h5f8h574h64chbch689h5bch87h57fh68h588h5ddhf5h5d4h565h5ddh56dh5d9h565h5fbh5c7h9fh84hb7hcdh86h89hf4h58fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j62dm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(7c64e6c1-1601-4c6d-9cfe-2287e9147576): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:40:10 crc kubenswrapper[4895]: E0320 13:40:10.917557 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/memcached-0" podUID="7c64e6c1-1601-4c6d-9cfe-2287e9147576" Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.922507 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerStarted","Data":"e7c115aca404b1360a94b3f6f5fee2200a739de32d7248fc9c2801936a52ae8d"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.924832 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.927450 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-4phvm" event={"ID":"f0db633f-39ca-4915-ab69-a17d9140e31b","Type":"ContainerStarted","Data":"2b572b5ec492395dee5dda2aa272bb6c53ec479e969b1f4e283b0ee352e356a5"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.929172 4895 generic.go:334] "Generic (PLEG): container finished" podID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerID="f05c099156f9e6c72a91a336e8ac885c6498762e2dba67af8a19fa1cabe34194" exitCode=0 Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.929200 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" event={"ID":"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb","Type":"ContainerDied","Data":"f05c099156f9e6c72a91a336e8ac885c6498762e2dba67af8a19fa1cabe34194"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.929215 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" event={"ID":"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb","Type":"ContainerStarted","Data":"9e59724f11df6c39e6b0fde7233bbf7a058f67fc67c7366508864c8f64892bd5"} Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.988644 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"] Mar 20 13:40:10 crc kubenswrapper[4895]: I0320 13:40:10.996207 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g2x4n"] Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.010909 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 20 13:40:11 crc kubenswrapper[4895]: W0320 13:40:11.027911 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c72a116_103e_4be6_91c2_65168b4d456e.slice/crio-424a53e03f79c39c1ccda50545f771f7fb212d7c04073c63bfa41b95ee8f9a26 WatchSource:0}: Error finding container 424a53e03f79c39c1ccda50545f771f7fb212d7c04073c63bfa41b95ee8f9a26: Status 404 
returned error can't find the container with id 424a53e03f79c39c1ccda50545f771f7fb212d7c04073c63bfa41b95ee8f9a26 Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.030521 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-index-gateway,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=index-gateway -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,M
ountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zl4nc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-index-gateway-0_openstack(2c72a116-103e-4be6-91c2-65168b4d456e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" 
Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.034921 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="2c72a116-103e-4be6-91c2-65168b4d456e" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.038546 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"] Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.045956 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-52rlf"] Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.131029 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.141184 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d"] Mar 20 13:40:11 crc kubenswrapper[4895]: W0320 13:40:11.142318 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27c73d65_3dcb_44cb_a61e_004919dda8b4.slice/crio-c40979fd02178d99d2d523c867316475994d5acfc96560d2627b0c50bc5d3619 WatchSource:0}: Error finding container c40979fd02178d99d2d523c867316475994d5acfc96560d2627b0c50bc5d3619: Status 404 returned error can't find the container with id c40979fd02178d99d2d523c867316475994d5acfc96560d2627b0c50bc5d3619 Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.144956 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z"] Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.151402 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg"] Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.152278 4895 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9pf5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-6f54889599-h8n6z_openstack(97b1a9d8-e379-4fe0-9036-3c05e9620b4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.154339 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" podUID="97b1a9d8-e379-4fe0-9036-3c05e9620b4a" Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.172619 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=querier 
-config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zz85x,ReadOnly:true,MountPath:/var/run/secrets/kub
ernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-668f98fdd7-ltb4d_openstack(384ff1a6-c0b2-4b58-aac3-e847f789de25): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.175086 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" podUID="384ff1a6-c0b2-4b58-aac3-e847f789de25" Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.180114 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:86738dca2db56373b411adb51a4dfde274968f8c8fde42dbc2daf4bac3bb8daf,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key 
--tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrvpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-6b884dc4b5-82dcg_openstack(9eda3cc0-3576-46cb-8da1-12ca651af767): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.181760 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" podUID="9eda3cc0-3576-46cb-8da1-12ca651af767" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.229636 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e11919e-55bb-43f6-8613-abb6d70088f4" path="/var/lib/kubelet/pods/5e11919e-55bb-43f6-8613-abb6d70088f4/volumes" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.230011 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dd6b9801-a5b2-4f9c-99a6-aec4c81bd381" path="/var/lib/kubelet/pods/dd6b9801-a5b2-4f9c-99a6-aec4c81bd381/volumes" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.230685 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mvskb"] Mar 20 13:40:11 crc kubenswrapper[4895]: W0320 13:40:11.238317 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea15e08a_0dc3_4b15_a90a_e06ae11a2ac9.slice/crio-8070ce427ecedd9f9730ef2bb39fd2023b05e1a6f60bd05d5047235d775eb1b9 WatchSource:0}: Error finding container 8070ce427ecedd9f9730ef2bb39fd2023b05e1a6f60bd05d5047235d775eb1b9: Status 404 returned error can't find the container with id 8070ce427ecedd9f9730ef2bb39fd2023b05e1a6f60bd05d5047235d775eb1b9 Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.938984 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvskb" event={"ID":"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9","Type":"ContainerStarted","Data":"8070ce427ecedd9f9730ef2bb39fd2023b05e1a6f60bd05d5047235d775eb1b9"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.941065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f27bbad-8a84-4902-8349-8c8724552442","Type":"ContainerStarted","Data":"6bd465cd027abf03f28efae2a3be37dab0a7fdd76d5d0099d41fbfe8fb34736b"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.942318 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2c72a116-103e-4be6-91c2-65168b4d456e","Type":"ContainerStarted","Data":"424a53e03f79c39c1ccda50545f771f7fb212d7c04073c63bfa41b95ee8f9a26"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.944244 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="2c72a116-103e-4be6-91c2-65168b4d456e" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.951291 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" event={"ID":"9eda3cc0-3576-46cb-8da1-12ca651af767","Type":"ContainerStarted","Data":"216fada6bf78e19cbe54b5574976d296a061150f91c92c6a97175b85174cde9d"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.952826 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:86738dca2db56373b411adb51a4dfde274968f8c8fde42dbc2daf4bac3bb8daf\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" podUID="9eda3cc0-3576-46cb-8da1-12ca651af767" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.953493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7c64e6c1-1601-4c6d-9cfe-2287e9147576","Type":"ContainerStarted","Data":"5888fc6da13cb71f3ed6de32ae120d977ba37a54a1ff8303407bf44bde0d9a09"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.954335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="7c64e6c1-1601-4c6d-9cfe-2287e9147576" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.955344 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" 
event={"ID":"384ff1a6-c0b2-4b58-aac3-e847f789de25","Type":"ContainerStarted","Data":"a034a16d600c3ea340a2f266f858db5764d9186a642dc6b4ea8a8662fdbd6297"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.956619 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" podUID="384ff1a6-c0b2-4b58-aac3-e847f789de25" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.958419 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" event={"ID":"b4cd9c2d-3b16-4152-9269-263b91fa4769","Type":"ContainerStarted","Data":"1563457e346b657b69ef1e10e05ce62afa061b99c8f32d5f4bc8ba3a4d611c66"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.960322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c851b618-6bf5-4291-ae40-20ed962dfe46","Type":"ContainerStarted","Data":"5bc2322bb565172dd61879d4b272d161653cec0fff8dc55f7496891e71a3f75c"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.961754 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" podUID="b4cd9c2d-3b16-4152-9269-263b91fa4769" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.969430 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" 
event={"ID":"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb","Type":"ContainerStarted","Data":"8875685e2299356974f7dedeb06fff52da5f56f81e2137bf2f77f3edff84b683"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.969771 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.973770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e","Type":"ContainerStarted","Data":"d310b3c4391b16142b8dc0f2089ad3dd471fa24e02b4d819d50a026bbdd7b939"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.975527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" event={"ID":"97b1a9d8-e379-4fe0-9036-3c05e9620b4a","Type":"ContainerStarted","Data":"efe2ef8a7ce25bcbc43126071414d2738577e635320657dafd2a13e422f1d3e9"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.976872 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerStarted","Data":"658f3576af50f1584711ea10ef11af77140ffd413ea8eb4a8544af05fc420721"} Mar 20 13:40:11 crc kubenswrapper[4895]: E0320 13:40:11.980186 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" podUID="97b1a9d8-e379-4fe0-9036-3c05e9620b4a" Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.989071 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" 
event={"ID":"27c73d65-3dcb-44cb-a61e-004919dda8b4","Type":"ContainerStarted","Data":"c40979fd02178d99d2d523c867316475994d5acfc96560d2627b0c50bc5d3619"} Mar 20 13:40:11 crc kubenswrapper[4895]: I0320 13:40:11.995837 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerStarted","Data":"c78eda001f0e711a8e0d3121a22d10d8539ca73e0c16b1edd860fc76c4d5b120"} Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.014315 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" event={"ID":"775fc130-07f0-45cc-88b8-357b47d31d40","Type":"ContainerStarted","Data":"9313604646fb5e94a0fd71d1e10533c0d80e5848ddc14e2ff988fe81d1f51052"} Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.014647 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.069584 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.078721 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" podStartSLOduration=18.136141548 podStartE2EDuration="19.078704298s" podCreationTimestamp="2026-03-20 13:39:53 +0000 UTC" firstStartedPulling="2026-03-20 13:40:09.412158155 +0000 UTC m=+1108.921877121" lastFinishedPulling="2026-03-20 13:40:10.354720885 +0000 UTC m=+1109.864439871" observedRunningTime="2026-03-20 13:40:12.068515165 +0000 UTC m=+1111.578234141" watchObservedRunningTime="2026-03-20 13:40:12.078704298 +0000 UTC m=+1111.588423264" Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.091029 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" podStartSLOduration=19.09100865 podStartE2EDuration="19.09100865s" 
podCreationTimestamp="2026-03-20 13:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:12.0860897 +0000 UTC m=+1111.595808666" watchObservedRunningTime="2026-03-20 13:40:12.09100865 +0000 UTC m=+1111.600727616" Mar 20 13:40:12 crc kubenswrapper[4895]: W0320 13:40:12.150491 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5761f186_a7a3_4ce2_8ed9_bcea12b186c8.slice/crio-b9d92c9ccd24bdd98bcddba73c1ef08ad26f5a17b6f94f3464e32ee1104efd70 WatchSource:0}: Error finding container b9d92c9ccd24bdd98bcddba73c1ef08ad26f5a17b6f94f3464e32ee1104efd70: Status 404 returned error can't find the container with id b9d92c9ccd24bdd98bcddba73c1ef08ad26f5a17b6f94f3464e32ee1104efd70 Mar 20 13:40:12 crc kubenswrapper[4895]: I0320 13:40:12.240731 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:40:13 crc kubenswrapper[4895]: I0320 13:40:13.024704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5761f186-a7a3-4ce2-8ed9-bcea12b186c8","Type":"ContainerStarted","Data":"b9d92c9ccd24bdd98bcddba73c1ef08ad26f5a17b6f94f3464e32ee1104efd70"} Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.028292 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" podUID="384ff1a6-c0b2-4b58-aac3-e847f789de25" Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.028327 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" podUID="b4cd9c2d-3b16-4152-9269-263b91fa4769" Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.028404 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="2c72a116-103e-4be6-91c2-65168b4d456e" Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.028426 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="7c64e6c1-1601-4c6d-9cfe-2287e9147576" Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.028776 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" podUID="97b1a9d8-e379-4fe0-9036-3c05e9620b4a" Mar 20 13:40:13 crc kubenswrapper[4895]: E0320 13:40:13.029923 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:86738dca2db56373b411adb51a4dfde274968f8c8fde42dbc2daf4bac3bb8daf\\\"\"" 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" podUID="9eda3cc0-3576-46cb-8da1-12ca651af767" Mar 20 13:40:13 crc kubenswrapper[4895]: W0320 13:40:13.471077 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf691a5e_1267_46ec_9d39_f4fa047a1741.slice/crio-0f17203e6c046ed9a626cef020144f9bad95b5bbfe88f833f65f6127e376adf6 WatchSource:0}: Error finding container 0f17203e6c046ed9a626cef020144f9bad95b5bbfe88f833f65f6127e376adf6: Status 404 returned error can't find the container with id 0f17203e6c046ed9a626cef020144f9bad95b5bbfe88f833f65f6127e376adf6 Mar 20 13:40:14 crc kubenswrapper[4895]: I0320 13:40:14.032006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af691a5e-1267-46ec-9d39-f4fa047a1741","Type":"ContainerStarted","Data":"0f17203e6c046ed9a626cef020144f9bad95b5bbfe88f833f65f6127e376adf6"} Mar 20 13:40:19 crc kubenswrapper[4895]: I0320 13:40:19.030736 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" Mar 20 13:40:19 crc kubenswrapper[4895]: I0320 13:40:19.327686 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" Mar 20 13:40:19 crc kubenswrapper[4895]: I0320 13:40:19.388795 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"] Mar 20 13:40:19 crc kubenswrapper[4895]: I0320 13:40:19.389083 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="dnsmasq-dns" containerID="cri-o://9313604646fb5e94a0fd71d1e10533c0d80e5848ddc14e2ff988fe81d1f51052" gracePeriod=10 Mar 20 13:40:21 crc kubenswrapper[4895]: I0320 13:40:21.113709 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="775fc130-07f0-45cc-88b8-357b47d31d40" containerID="9313604646fb5e94a0fd71d1e10533c0d80e5848ddc14e2ff988fe81d1f51052" exitCode=0 Mar 20 13:40:21 crc kubenswrapper[4895]: I0320 13:40:21.113826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" event={"ID":"775fc130-07f0-45cc-88b8-357b47d31d40","Type":"ContainerDied","Data":"9313604646fb5e94a0fd71d1e10533c0d80e5848ddc14e2ff988fe81d1f51052"} Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.140589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" event={"ID":"775fc130-07f0-45cc-88b8-357b47d31d40","Type":"ContainerDied","Data":"56c889fd550c61e282500a1775fb312f7d277c45eb88a535fdc548e807460c23"} Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.141245 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c889fd550c61e282500a1775fb312f7d277c45eb88a535fdc548e807460c23" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.158506 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.234128 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.304339 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94kp\" (UniqueName: \"kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp\") pod \"775fc130-07f0-45cc-88b8-357b47d31d40\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.304418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config\") pod \"775fc130-07f0-45cc-88b8-357b47d31d40\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.304534 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc\") pod \"775fc130-07f0-45cc-88b8-357b47d31d40\" (UID: \"775fc130-07f0-45cc-88b8-357b47d31d40\") " Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.307487 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp" (OuterVolumeSpecName: "kube-api-access-x94kp") pod "775fc130-07f0-45cc-88b8-357b47d31d40" (UID: "775fc130-07f0-45cc-88b8-357b47d31d40"). InnerVolumeSpecName "kube-api-access-x94kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.339661 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config" (OuterVolumeSpecName: "config") pod "775fc130-07f0-45cc-88b8-357b47d31d40" (UID: "775fc130-07f0-45cc-88b8-357b47d31d40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.349138 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "775fc130-07f0-45cc-88b8-357b47d31d40" (UID: "775fc130-07f0-45cc-88b8-357b47d31d40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.407964 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.407994 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94kp\" (UniqueName: \"kubernetes.io/projected/775fc130-07f0-45cc-88b8-357b47d31d40-kube-api-access-x94kp\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:24 crc kubenswrapper[4895]: I0320 13:40:24.408006 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775fc130-07f0-45cc-88b8-357b47d31d40-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:24 crc kubenswrapper[4895]: E0320 13:40:24.884656 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 13:40:24 
crc kubenswrapper[4895]: E0320 13:40:24.884702 4895 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 13:40:24 crc kubenswrapper[4895]: E0320 13:40:24.884848 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhb6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(9f27bbad-8a84-4902-8349-8c8724552442): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:40:24 crc kubenswrapper[4895]: E0320 13:40:24.886044 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="9f27bbad-8a84-4902-8349-8c8724552442" Mar 20 13:40:25 crc kubenswrapper[4895]: I0320 13:40:25.147726 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" Mar 20 13:40:25 crc kubenswrapper[4895]: E0320 13:40:25.150100 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="9f27bbad-8a84-4902-8349-8c8724552442" Mar 20 13:40:25 crc kubenswrapper[4895]: I0320 13:40:25.233030 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"] Mar 20 13:40:25 crc kubenswrapper[4895]: I0320 13:40:25.243264 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gz6jx"] Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.178442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af691a5e-1267-46ec-9d39-f4fa047a1741","Type":"ContainerStarted","Data":"fdd4f410f60ef6f5ac3a0c75df0b46cbc6ede8dddd69fc1648cf2a9d94e9debd"} Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.180278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" event={"ID":"faa3805b-edc0-4e1a-91e5-05667f94e119","Type":"ContainerStarted","Data":"b03e9c308c17d9b4b58a25ef24c834bf25a532c943b05c1411e8f684f8ed7b68"} Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.180470 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.181855 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b3c4f62-dc8a-49bd-b97e-d57133678e19","Type":"ContainerStarted","Data":"5b173d7fcb04e1780e07a221f6c6e993fbee0419e9420c7087bdb92b4b579b16"} Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.187714 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e","Type":"ContainerStarted","Data":"ca1470e5ec801fa2e5a7e2d943b4d78ed2f9b625f659cd09392315ab13fe369f"} Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.187905 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.189797 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c851b618-6bf5-4291-ae40-20ed962dfe46","Type":"ContainerStarted","Data":"f63d8e9012fe8ae5fd29bffab4162c08ee9580e581a612047e73da7251fea8fd"} Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.194342 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.201057 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-n2t45" podStartSLOduration=6.787207737 podStartE2EDuration="20.20104184s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.841210246 +0000 UTC m=+1110.350929212" lastFinishedPulling="2026-03-20 13:40:24.255044349 +0000 UTC m=+1123.764763315" observedRunningTime="2026-03-20 13:40:27.199233696 +0000 UTC m=+1126.708952662" watchObservedRunningTime="2026-03-20 13:40:27.20104184 +0000 UTC m=+1126.710760806" Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.225239 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" path="/var/lib/kubelet/pods/775fc130-07f0-45cc-88b8-357b47d31d40/volumes" Mar 20 13:40:27 crc kubenswrapper[4895]: I0320 13:40:27.290662 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=6.614618406 podStartE2EDuration="20.290643302s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.842447266 +0000 UTC m=+1110.352166232" lastFinishedPulling="2026-03-20 13:40:24.518472162 +0000 UTC m=+1124.028191128" observedRunningTime="2026-03-20 13:40:27.289483753 +0000 UTC m=+1126.799202739" watchObservedRunningTime="2026-03-20 13:40:27.290643302 +0000 UTC m=+1126.800362268" Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.206193 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerStarted","Data":"7f0ecc47a978afc25c2a7716be49f21ca24938da5b6654d45c79de22b1b4e5a1"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.208469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"27c73d65-3dcb-44cb-a61e-004919dda8b4","Type":"ContainerStarted","Data":"821d48a681aa35bd4da293a47288cdcb6e60540eda40e6d15a8f1b605f231bb1"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.208747 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.210928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvskb" event={"ID":"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9","Type":"ContainerStarted","Data":"38008828c6e791326a82cb492f9fbc66bae56114664d2b10362dc9775e81f4af"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.215465 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5761f186-a7a3-4ce2-8ed9-bcea12b186c8","Type":"ContainerStarted","Data":"a31cf2ab9a2a2281edaf9252a8e1af55d4cb637ae91ed6f1523be541d3bdd8ec"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.216997 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" event={"ID":"97b1a9d8-e379-4fe0-9036-3c05e9620b4a","Type":"ContainerStarted","Data":"d3a6650a033a422fb9af756e01984090af14aa008584c0dab2fd99ecca77a880"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.217224 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.219405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerStarted","Data":"9aabfb7c063f6233a078d4c562ccf0edf17a494752a0173dd120f4d4b03ed45d"} Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.271970 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" podStartSLOduration=7.286985726 podStartE2EDuration="21.271946899s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.152161093 +0000 UTC m=+1110.661880059" lastFinishedPulling="2026-03-20 13:40:25.137122266 +0000 UTC m=+1124.646841232" observedRunningTime="2026-03-20 13:40:28.264524006 +0000 UTC m=+1127.774243012" watchObservedRunningTime="2026-03-20 13:40:28.271946899 +0000 UTC m=+1127.781665875" Mar 20 13:40:28 crc kubenswrapper[4895]: I0320 13:40:28.345480 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=8.09320027 podStartE2EDuration="21.345458624s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.144465672 +0000 UTC m=+1110.654184638" lastFinishedPulling="2026-03-20 13:40:24.396724006 +0000 UTC m=+1123.906442992" observedRunningTime="2026-03-20 13:40:28.33073354 +0000 UTC m=+1127.840452506" watchObservedRunningTime="2026-03-20 
13:40:28.345458624 +0000 UTC m=+1127.855177610" Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.030538 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-gz6jx" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.229671 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerStarted","Data":"1f2c8768a0a5dc360d597d5566e03e8d30ef9148aaf6f8c528cbaa28af5faa51"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.232494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d413e49a-6f03-44fc-87bf-f6b71efac9ad","Type":"ContainerStarted","Data":"ebc7f50c119e6e9b8798079ade8e1d5b295f4b52b98c94ad89f69c80f838b508"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.235139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4phvm" event={"ID":"f0db633f-39ca-4915-ab69-a17d9140e31b","Type":"ContainerStarted","Data":"2ca9d4ab24dc79161c6c0123067e057d985666fc98c7ac62fc6a4d5e1022cfec"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.235603 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4phvm" Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.237988 4895 generic.go:334] "Generic (PLEG): container finished" podID="f341d72e-a04d-4f58-a7f9-bed0b19710ae" containerID="26785eee0c449721bcfafa52af3939655f25e08cd85cee66791863157bd2c4c9" exitCode=0 Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.238091 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-4rshv" 
event={"ID":"f341d72e-a04d-4f58-a7f9-bed0b19710ae","Type":"ContainerDied","Data":"26785eee0c449721bcfafa52af3939655f25e08cd85cee66791863157bd2c4c9"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.240185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" event={"ID":"384ff1a6-c0b2-4b58-aac3-e847f789de25","Type":"ContainerStarted","Data":"c6c4aab8d1934e5b84277735676d0a9935de8c930df87079ea06cde095fd8064"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.240349 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.241952 4895 generic.go:334] "Generic (PLEG): container finished" podID="ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9" containerID="38008828c6e791326a82cb492f9fbc66bae56114664d2b10362dc9775e81f4af" exitCode=0 Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.241989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvskb" event={"ID":"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9","Type":"ContainerDied","Data":"38008828c6e791326a82cb492f9fbc66bae56114664d2b10362dc9775e81f4af"} Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.291645 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" podStartSLOduration=-9223372014.563156 podStartE2EDuration="22.291619992s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.172269659 +0000 UTC m=+1110.681988625" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:29.283451312 +0000 UTC m=+1128.793170278" watchObservedRunningTime="2026-03-20 13:40:29.291619992 +0000 UTC m=+1128.801338968" Mar 20 13:40:29 crc kubenswrapper[4895]: I0320 13:40:29.304049 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-4phvm" podStartSLOduration=12.886229338 podStartE2EDuration="26.304027829s" podCreationTimestamp="2026-03-20 13:40:03 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.838448038 +0000 UTC m=+1110.348167004" lastFinishedPulling="2026-03-20 13:40:24.256246529 +0000 UTC m=+1123.765965495" observedRunningTime="2026-03-20 13:40:29.299405625 +0000 UTC m=+1128.809124591" watchObservedRunningTime="2026-03-20 13:40:29.304027829 +0000 UTC m=+1128.813746795" Mar 20 13:40:30 crc kubenswrapper[4895]: I0320 13:40:30.997567 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.042079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrj4l\" (UniqueName: \"kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l\") pod \"f341d72e-a04d-4f58-a7f9-bed0b19710ae\" (UID: \"f341d72e-a04d-4f58-a7f9-bed0b19710ae\") " Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.058736 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l" (OuterVolumeSpecName: "kube-api-access-mrj4l") pod "f341d72e-a04d-4f58-a7f9-bed0b19710ae" (UID: "f341d72e-a04d-4f58-a7f9-bed0b19710ae"). InnerVolumeSpecName "kube-api-access-mrj4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.144639 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrj4l\" (UniqueName: \"kubernetes.io/projected/f341d72e-a04d-4f58-a7f9-bed0b19710ae-kube-api-access-mrj4l\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.264737 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c851b618-6bf5-4291-ae40-20ed962dfe46","Type":"ContainerDied","Data":"f63d8e9012fe8ae5fd29bffab4162c08ee9580e581a612047e73da7251fea8fd"} Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.264847 4895 generic.go:334] "Generic (PLEG): container finished" podID="c851b618-6bf5-4291-ae40-20ed962dfe46" containerID="f63d8e9012fe8ae5fd29bffab4162c08ee9580e581a612047e73da7251fea8fd" exitCode=0 Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.271499 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-4rshv" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.271522 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-4rshv" event={"ID":"f341d72e-a04d-4f58-a7f9-bed0b19710ae","Type":"ContainerDied","Data":"cd3e260108bcacb9a5a68d4610fbebf2c681e32d74816a9aea632038b23e12c2"} Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.271582 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd3e260108bcacb9a5a68d4610fbebf2c681e32d74816a9aea632038b23e12c2" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.290414 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" event={"ID":"b4cd9c2d-3b16-4152-9269-263b91fa4769","Type":"ContainerStarted","Data":"32d37ea6e4965c442dfb6c1c6b64014ee63319712b75d79b7e8d1b3bfbd28b19"} Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.290651 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:31 crc kubenswrapper[4895]: I0320 13:40:31.348632 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" podStartSLOduration=-9223372012.506163 podStartE2EDuration="24.348612946s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.903610126 +0000 UTC m=+1110.413329092" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:31.315462408 +0000 UTC m=+1130.825181374" watchObservedRunningTime="2026-03-20 13:40:31.348612946 +0000 UTC m=+1130.858331912" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.072407 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-gpgqn"] Mar 20 13:40:32 crc kubenswrapper[4895]: 
I0320 13:40:32.086118 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-gpgqn"] Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.302204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5761f186-a7a3-4ce2-8ed9-bcea12b186c8","Type":"ContainerStarted","Data":"a8cb93e56ba969a83e6ddb09093c120ce93cdd389f606db46491b4f1608bcb34"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.305591 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2c72a116-103e-4be6-91c2-65168b4d456e","Type":"ContainerStarted","Data":"57b50bd75a5a6668b163a073364142015428f7fa673983b0d81de05ea0bedaa9"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.305983 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.308676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" event={"ID":"9eda3cc0-3576-46cb-8da1-12ca651af767","Type":"ContainerStarted","Data":"2fab8f36776eae713949483f3a0dc52dcad227260edb859861b18690cdae07cd"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.309029 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.312619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c851b618-6bf5-4291-ae40-20ed962dfe46","Type":"ContainerStarted","Data":"a066401ea74a54b7248beb625a4fd6919e8008914d206e6aacfbecbff076aa1d"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.315494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"7c64e6c1-1601-4c6d-9cfe-2287e9147576","Type":"ContainerStarted","Data":"241dd32edd9f92bd14eb1ea812f979e715f8fb28b77b589d165d8ff529563d67"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.315835 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.319091 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvskb" event={"ID":"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9","Type":"ContainerStarted","Data":"37d4d6e8881cfe9a76f97a37a2ad4e0b244e9adbc87448df355cec30121aefb5"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.319139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mvskb" event={"ID":"ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9","Type":"ContainerStarted","Data":"301e89e7c9ba05188a3f94a51eda68b5d67b92966ba6f0eb590406d5d2f6a648"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.320163 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.320224 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.322810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"af691a5e-1267-46ec-9d39-f4fa047a1741","Type":"ContainerStarted","Data":"4feb395d0f5c47e2736bb79215a7d4480deb3861f9d50aec0d5775b2330307ad"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.328216 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b3c4f62-dc8a-49bd-b97e-d57133678e19" containerID="5b173d7fcb04e1780e07a221f6c6e993fbee0419e9420c7087bdb92b4b579b16" exitCode=0 Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.328286 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"6b3c4f62-dc8a-49bd-b97e-d57133678e19","Type":"ContainerDied","Data":"5b173d7fcb04e1780e07a221f6c6e993fbee0419e9420c7087bdb92b4b579b16"} Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.338548 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.351616 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.209230149 podStartE2EDuration="24.351586348s" podCreationTimestamp="2026-03-20 13:40:08 +0000 UTC" firstStartedPulling="2026-03-20 13:40:12.154167581 +0000 UTC m=+1111.663886557" lastFinishedPulling="2026-03-20 13:40:31.2965238 +0000 UTC m=+1130.806242756" observedRunningTime="2026-03-20 13:40:32.32654375 +0000 UTC m=+1131.836262716" watchObservedRunningTime="2026-03-20 13:40:32.351586348 +0000 UTC m=+1131.861305354" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.358735 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.952308964 podStartE2EDuration="36.358713924s" podCreationTimestamp="2026-03-20 13:39:56 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.849598743 +0000 UTC m=+1110.359317709" lastFinishedPulling="2026-03-20 13:40:24.256003703 +0000 UTC m=+1123.765722669" observedRunningTime="2026-03-20 13:40:32.347861176 +0000 UTC m=+1131.857580182" watchObservedRunningTime="2026-03-20 13:40:32.358713924 +0000 UTC m=+1131.868432900" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.383749 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-82dcg" podStartSLOduration=-9223372011.471046 podStartE2EDuration="25.383729702s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.179962609 +0000 UTC m=+1110.689681575" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:32.376500623 +0000 UTC m=+1131.886219599" watchObservedRunningTime="2026-03-20 13:40:32.383729702 +0000 UTC m=+1131.893448668" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.405475 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.14646235 podStartE2EDuration="34.405450307s" podCreationTimestamp="2026-03-20 13:39:58 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.915197823 +0000 UTC m=+1110.424916789" lastFinishedPulling="2026-03-20 13:40:31.17418578 +0000 UTC m=+1130.683904746" observedRunningTime="2026-03-20 13:40:32.398047835 +0000 UTC m=+1131.907766841" watchObservedRunningTime="2026-03-20 13:40:32.405450307 +0000 UTC m=+1131.915169313" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.457592 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=-9223372011.397205 podStartE2EDuration="25.457571015s" podCreationTimestamp="2026-03-20 13:40:07 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.030351865 +0000 UTC m=+1110.540070831" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:32.423765619 +0000 UTC m=+1131.933484585" watchObservedRunningTime="2026-03-20 13:40:32.457571015 +0000 UTC m=+1131.967289981" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.478188 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mvskb" podStartSLOduration=16.936809318 podStartE2EDuration="29.478168353s" podCreationTimestamp="2026-03-20 13:40:03 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.243792844 +0000 UTC m=+1110.753511800" lastFinishedPulling="2026-03-20 13:40:23.785151869 +0000 UTC m=+1123.294870835" observedRunningTime="2026-03-20 13:40:32.450773777 +0000 UTC m=+1131.960492763" 
watchObservedRunningTime="2026-03-20 13:40:32.478168353 +0000 UTC m=+1131.987887319" Mar 20 13:40:32 crc kubenswrapper[4895]: I0320 13:40:32.514278 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.70682771 podStartE2EDuration="29.514258234s" podCreationTimestamp="2026-03-20 13:40:03 +0000 UTC" firstStartedPulling="2026-03-20 13:40:13.474319352 +0000 UTC m=+1112.984038318" lastFinishedPulling="2026-03-20 13:40:31.281749876 +0000 UTC m=+1130.791468842" observedRunningTime="2026-03-20 13:40:32.479531126 +0000 UTC m=+1131.989250172" watchObservedRunningTime="2026-03-20 13:40:32.514258234 +0000 UTC m=+1132.023977220" Mar 20 13:40:33 crc kubenswrapper[4895]: I0320 13:40:33.225614 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc86faca-eea0-4505-a809-8cbbdb6342fa" path="/var/lib/kubelet/pods/dc86faca-eea0-4505-a809-8cbbdb6342fa/volumes" Mar 20 13:40:33 crc kubenswrapper[4895]: I0320 13:40:33.440926 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:33 crc kubenswrapper[4895]: I0320 13:40:33.495062 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.349001 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.349523 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.353541 4895 generic.go:334] "Generic (PLEG): container finished" podID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerID="1f2c8768a0a5dc360d597d5566e03e8d30ef9148aaf6f8c528cbaa28af5faa51" exitCode=0 Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.353618 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerDied","Data":"1f2c8768a0a5dc360d597d5566e03e8d30ef9148aaf6f8c528cbaa28af5faa51"} Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.358775 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6b3c4f62-dc8a-49bd-b97e-d57133678e19","Type":"ContainerStarted","Data":"974ad6a70bdfcba84f1092881857a3a9a1363ba76291942f59d04a59e096db93"} Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.366351 4895 generic.go:334] "Generic (PLEG): container finished" podID="d413e49a-6f03-44fc-87bf-f6b71efac9ad" containerID="ebc7f50c119e6e9b8798079ade8e1d5b295f4b52b98c94ad89f69c80f838b508" exitCode=0 Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.366428 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d413e49a-6f03-44fc-87bf-f6b71efac9ad","Type":"ContainerDied","Data":"ebc7f50c119e6e9b8798079ade8e1d5b295f4b52b98c94ad89f69c80f838b508"} Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.367483 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.409071 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.433791 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.457852 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.77621381 podStartE2EDuration="39.457828317s" podCreationTimestamp="2026-03-20 13:39:55 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.837693888 +0000 UTC m=+1110.347412854" lastFinishedPulling="2026-03-20 13:40:23.519308395 
+0000 UTC m=+1123.029027361" observedRunningTime="2026-03-20 13:40:34.450771943 +0000 UTC m=+1133.960490909" watchObservedRunningTime="2026-03-20 13:40:34.457828317 +0000 UTC m=+1133.967547283" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.631855 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"] Mar 20 13:40:34 crc kubenswrapper[4895]: E0320 13:40:34.632268 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="init" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.632287 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="init" Mar 20 13:40:34 crc kubenswrapper[4895]: E0320 13:40:34.632318 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f341d72e-a04d-4f58-a7f9-bed0b19710ae" containerName="oc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.632326 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f341d72e-a04d-4f58-a7f9-bed0b19710ae" containerName="oc" Mar 20 13:40:34 crc kubenswrapper[4895]: E0320 13:40:34.632354 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="dnsmasq-dns" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.632362 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="dnsmasq-dns" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.632577 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f341d72e-a04d-4f58-a7f9-bed0b19710ae" containerName="oc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.632597 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="775fc130-07f0-45cc-88b8-357b47d31d40" containerName="dnsmasq-dns" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.633775 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.637217 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.642212 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"] Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.685922 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-46hcc"] Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.686967 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.688471 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.700714 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-46hcc"] Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733687 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovs-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733769 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncf8\" (UniqueName: \"kubernetes.io/projected/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-kube-api-access-jncf8\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733803 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-combined-ca-bundle\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733827 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-config\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733849 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovn-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733869 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733904 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733933 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733947 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.733988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqw5\" (UniqueName: \"kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovs-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836137 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jncf8\" (UniqueName: \"kubernetes.io/projected/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-kube-api-access-jncf8\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: 
I0320 13:40:34.836169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-combined-ca-bundle\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-config\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836219 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovn-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836271 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836320 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836339 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836378 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqw5\" (UniqueName: \"kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovs-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.836459 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-ovn-rundir\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.837167 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.837212 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-config\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.837564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.837569 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.842660 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.842942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-combined-ca-bundle\") pod 
\"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.855713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncf8\" (UniqueName: \"kubernetes.io/projected/2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2-kube-api-access-jncf8\") pod \"ovn-controller-metrics-46hcc\" (UID: \"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2\") " pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.855826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqw5\" (UniqueName: \"kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5\") pod \"dnsmasq-dns-6bc7876d45-kcwnl\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:34 crc kubenswrapper[4895]: I0320 13:40:34.949692 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.005925 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-46hcc" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.089731 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.101719 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.103139 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.105897 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.112895 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.141690 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.141757 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989jj\" (UniqueName: \"kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.141804 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.141839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " 
pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.141884 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.244109 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.245172 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.245231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.245281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc 
kubenswrapper[4895]: I0320 13:40:35.245352 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989jj\" (UniqueName: \"kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.245712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.246334 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.246347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.246660 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.283685 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-989jj\" (UniqueName: \"kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj\") pod \"dnsmasq-dns-8554648995-8z7w6\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") " pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.435846 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.471465 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:40:35 crc kubenswrapper[4895]: W0320 13:40:35.582216 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2be1f9cb_60c7_4ffd_8ac2_d7e47df959d2.slice/crio-9126bdf92fb3cd7a5b1f27ca11b2b0a7874e87a0b426d66df9f3ff94a0f9e34d WatchSource:0}: Error finding container 9126bdf92fb3cd7a5b1f27ca11b2b0a7874e87a0b426d66df9f3ff94a0f9e34d: Status 404 returned error can't find the container with id 9126bdf92fb3cd7a5b1f27ca11b2b0a7874e87a0b426d66df9f3ff94a0f9e34d Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.585118 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-46hcc"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.617247 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.676274 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.677991 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.686130 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.686330 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.686964 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.687016 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f8nzk" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.701553 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.863926 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-scripts\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864243 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/676abdd9-331c-4d23-817d-a608c366a737-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 
13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864300 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864337 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864369 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2c9\" (UniqueName: \"kubernetes.io/projected/676abdd9-331c-4d23-817d-a608c366a737-kube-api-access-2r2c9\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.864403 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-config\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2c9\" (UniqueName: \"kubernetes.io/projected/676abdd9-331c-4d23-817d-a608c366a737-kube-api-access-2r2c9\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965735 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-config\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-scripts\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965865 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/676abdd9-331c-4d23-817d-a608c366a737-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965909 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.965943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.966703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-scripts\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.966748 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676abdd9-331c-4d23-817d-a608c366a737-config\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.967052 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/676abdd9-331c-4d23-817d-a608c366a737-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.971279 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.972433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.973874 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/676abdd9-331c-4d23-817d-a608c366a737-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:35 crc kubenswrapper[4895]: I0320 13:40:35.982657 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2c9\" (UniqueName: \"kubernetes.io/projected/676abdd9-331c-4d23-817d-a608c366a737-kube-api-access-2r2c9\") pod \"ovn-northd-0\" (UID: \"676abdd9-331c-4d23-817d-a608c366a737\") " pod="openstack/ovn-northd-0" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.024558 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.044298 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"] Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.392259 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f72b56f-f910-4f15-86ec-982800c24df8" containerID="f4b501ab06d91cfe73035e6bedbf91f380b3e4e674f4f09477333243de34af10" exitCode=0 Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.392348 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8z7w6" event={"ID":"3f72b56f-f910-4f15-86ec-982800c24df8","Type":"ContainerDied","Data":"f4b501ab06d91cfe73035e6bedbf91f380b3e4e674f4f09477333243de34af10"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.392923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8z7w6" event={"ID":"3f72b56f-f910-4f15-86ec-982800c24df8","Type":"ContainerStarted","Data":"c22436ca4cc7faefc4615b01c88ef8e1703819a18114e0cf0d17de2a649d166e"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.395100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-46hcc" 
event={"ID":"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2","Type":"ContainerStarted","Data":"4b756d0ce580513b01695a77c5c8ff2d323c5dcef70e58c93160480d45a8bfd9"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.395180 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-46hcc" event={"ID":"2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2","Type":"ContainerStarted","Data":"9126bdf92fb3cd7a5b1f27ca11b2b0a7874e87a0b426d66df9f3ff94a0f9e34d"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.398492 4895 generic.go:334] "Generic (PLEG): container finished" podID="ec5f08e6-fa09-470b-b3c0-42b296424886" containerID="266bf49e248cb9d651b2696a46844e661bcde1c05550a585bed4b019d45ad25c" exitCode=0 Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.398556 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" event={"ID":"ec5f08e6-fa09-470b-b3c0-42b296424886","Type":"ContainerDied","Data":"266bf49e248cb9d651b2696a46844e661bcde1c05550a585bed4b019d45ad25c"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.398625 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" event={"ID":"ec5f08e6-fa09-470b-b3c0-42b296424886","Type":"ContainerStarted","Data":"a338a5a7b362996c3142fb43a78365c76eff655dc9c201ef86eaf587f25217fd"} Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.477597 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-46hcc" podStartSLOduration=2.477561511 podStartE2EDuration="2.477561511s" podCreationTimestamp="2026-03-20 13:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:36.449704603 +0000 UTC m=+1135.959423569" watchObservedRunningTime="2026-03-20 13:40:36.477561511 +0000 UTC m=+1135.987280477" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.545591 4895 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:40:36 crc kubenswrapper[4895]: W0320 13:40:36.571061 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676abdd9_331c_4d23_817d_a608c366a737.slice/crio-960d93cc4aa50b28d19a4b2f7932591dec14f9441cd16cfbcbe9ce4414147a2c WatchSource:0}: Error finding container 960d93cc4aa50b28d19a4b2f7932591dec14f9441cd16cfbcbe9ce4414147a2c: Status 404 returned error can't find the container with id 960d93cc4aa50b28d19a4b2f7932591dec14f9441cd16cfbcbe9ce4414147a2c Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.604655 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.604695 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.788863 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.903768 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb\") pod \"ec5f08e6-fa09-470b-b3c0-42b296424886\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.904161 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc\") pod \"ec5f08e6-fa09-470b-b3c0-42b296424886\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.904204 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbqw5\" (UniqueName: \"kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5\") pod \"ec5f08e6-fa09-470b-b3c0-42b296424886\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.904235 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config\") pod \"ec5f08e6-fa09-470b-b3c0-42b296424886\" (UID: \"ec5f08e6-fa09-470b-b3c0-42b296424886\") " Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.915643 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5" (OuterVolumeSpecName: "kube-api-access-lbqw5") pod "ec5f08e6-fa09-470b-b3c0-42b296424886" (UID: "ec5f08e6-fa09-470b-b3c0-42b296424886"). InnerVolumeSpecName "kube-api-access-lbqw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.929617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config" (OuterVolumeSpecName: "config") pod "ec5f08e6-fa09-470b-b3c0-42b296424886" (UID: "ec5f08e6-fa09-470b-b3c0-42b296424886"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.933409 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec5f08e6-fa09-470b-b3c0-42b296424886" (UID: "ec5f08e6-fa09-470b-b3c0-42b296424886"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:36 crc kubenswrapper[4895]: I0320 13:40:36.936449 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec5f08e6-fa09-470b-b3c0-42b296424886" (UID: "ec5f08e6-fa09-470b-b3c0-42b296424886"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.005671 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.005700 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbqw5\" (UniqueName: \"kubernetes.io/projected/ec5f08e6-fa09-470b-b3c0-42b296424886-kube-api-access-lbqw5\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.005712 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.005720 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec5f08e6-fa09-470b-b3c0-42b296424886-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.408619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8z7w6" event={"ID":"3f72b56f-f910-4f15-86ec-982800c24df8","Type":"ContainerStarted","Data":"aea4fe2fdce6d9eac7aaf71fadfc9e5926f0902a60ac81188f68195f1037fdc0"}
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.408914 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-8z7w6"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.409985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"676abdd9-331c-4d23-817d-a608c366a737","Type":"ContainerStarted","Data":"960d93cc4aa50b28d19a4b2f7932591dec14f9441cd16cfbcbe9ce4414147a2c"}
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.413023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f27bbad-8a84-4902-8349-8c8724552442","Type":"ContainerStarted","Data":"27c97374a7acc4bbb9412bcb712b4beb51a28c4c64a1d5a4de424a262ffdba2f"}
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.413247 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.414972 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl" event={"ID":"ec5f08e6-fa09-470b-b3c0-42b296424886","Type":"ContainerDied","Data":"a338a5a7b362996c3142fb43a78365c76eff655dc9c201ef86eaf587f25217fd"}
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.415020 4895 scope.go:117] "RemoveContainer" containerID="266bf49e248cb9d651b2696a46844e661bcde1c05550a585bed4b019d45ad25c"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.415025 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-kcwnl"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.428340 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-8z7w6" podStartSLOduration=2.428326384 podStartE2EDuration="2.428326384s" podCreationTimestamp="2026-03-20 13:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:37.427059272 +0000 UTC m=+1136.936778238" watchObservedRunningTime="2026-03-20 13:40:37.428326384 +0000 UTC m=+1136.938045350"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.452474 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.663449395 podStartE2EDuration="37.452457949s" podCreationTimestamp="2026-03-20 13:40:00 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.842699652 +0000 UTC m=+1110.352418618" lastFinishedPulling="2026-03-20 13:40:36.631708206 +0000 UTC m=+1136.141427172" observedRunningTime="2026-03-20 13:40:37.45170199 +0000 UTC m=+1136.961420976" watchObservedRunningTime="2026-03-20 13:40:37.452457949 +0000 UTC m=+1136.962176915"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.496490 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"]
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.502063 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-kcwnl"]
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.998131 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:40:37 crc kubenswrapper[4895]: I0320 13:40:37.998467 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:40:38 crc kubenswrapper[4895]: I0320 13:40:38.084241 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:40:38 crc kubenswrapper[4895]: I0320 13:40:38.404433 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 20 13:40:38 crc kubenswrapper[4895]: I0320 13:40:38.581521 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.221629 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5f08e6-fa09-470b-b3c0-42b296424886" path="/var/lib/kubelet/pods/ec5f08e6-fa09-470b-b3c0-42b296424886/volumes"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.237683 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.317863 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.440982 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d413e49a-6f03-44fc-87bf-f6b71efac9ad","Type":"ContainerStarted","Data":"aef577d7f8da2963b0ab5eec488ee3c05f472616e6b6a40807529040a8e7ab7d"}
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.444497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"676abdd9-331c-4d23-817d-a608c366a737","Type":"ContainerStarted","Data":"dea2e2095c339af80b8669aae55263b2695831907d9321ae75e17e28ff612271"}
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.444552 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"676abdd9-331c-4d23-817d-a608c366a737","Type":"ContainerStarted","Data":"31768c0f9b1a9bf7ba020bbbf8510c20142ae7bd8f98d63cf488df22e085855d"}
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.444922 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.464027 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.82838066 podStartE2EDuration="4.464008871s" podCreationTimestamp="2026-03-20 13:40:35 +0000 UTC" firstStartedPulling="2026-03-20 13:40:36.624901758 +0000 UTC m=+1136.134620724" lastFinishedPulling="2026-03-20 13:40:38.260529969 +0000 UTC m=+1137.770248935" observedRunningTime="2026-03-20 13:40:39.46237283 +0000 UTC m=+1138.972091796" watchObservedRunningTime="2026-03-20 13:40:39.464008871 +0000 UTC m=+1138.973727837"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.559697 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e0bd-account-create-update-gh5pj"]
Mar 20 13:40:39 crc kubenswrapper[4895]: E0320 13:40:39.560094 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5f08e6-fa09-470b-b3c0-42b296424886" containerName="init"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.560112 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5f08e6-fa09-470b-b3c0-42b296424886" containerName="init"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.560352 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5f08e6-fa09-470b-b3c0-42b296424886" containerName="init"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.561043 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.562980 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.568015 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e0bd-account-create-update-gh5pj"]
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.624115 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kwm44"]
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.625335 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.631838 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kwm44"]
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.680058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmmx\" (UniqueName: \"kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.680445 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.781608 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.782127 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgbq\" (UniqueName: \"kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.782265 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmmx\" (UniqueName: \"kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.782516 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.783096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.803171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmmx\" (UniqueName: \"kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx\") pod \"placement-e0bd-account-create-update-gh5pj\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.884608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgbq\" (UniqueName: \"kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.884770 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.885440 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.888283 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0bd-account-create-update-gh5pj"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.909849 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgbq\" (UniqueName: \"kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq\") pod \"placement-db-create-kwm44\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:39 crc kubenswrapper[4895]: I0320 13:40:39.939975 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kwm44"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.393621 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e0bd-account-create-update-gh5pj"]
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.521544 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kwm44"]
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.581901 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"]
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.582178 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-8z7w6" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="dnsmasq-dns" containerID="cri-o://aea4fe2fdce6d9eac7aaf71fadfc9e5926f0902a60ac81188f68195f1037fdc0" gracePeriod=10
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.642270 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"]
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.643784 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.649020 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"]
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.808111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.808155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx826\" (UniqueName: \"kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.808186 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.808220 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.808281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.909781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.909898 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx826\" (UniqueName: \"kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.910037 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.910075 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.910141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.911026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.911217 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.911413 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.911670 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.942812 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx826\" (UniqueName: \"kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826\") pod \"dnsmasq-dns-b8fbc5445-9kqz5\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:40 crc kubenswrapper[4895]: I0320 13:40:40.960230 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.491925 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d413e49a-6f03-44fc-87bf-f6b71efac9ad","Type":"ContainerStarted","Data":"f7a65b2d178c43b5b24985413959e50beb09c6c7d2053dce00543e21fe175f59"}
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.492643 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.506437 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.509065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8z7w6" event={"ID":"3f72b56f-f910-4f15-86ec-982800c24df8","Type":"ContainerDied","Data":"aea4fe2fdce6d9eac7aaf71fadfc9e5926f0902a60ac81188f68195f1037fdc0"}
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.509144 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f72b56f-f910-4f15-86ec-982800c24df8" containerID="aea4fe2fdce6d9eac7aaf71fadfc9e5926f0902a60ac81188f68195f1037fdc0" exitCode=0
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.524338 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=14.092114155 podStartE2EDuration="41.524321086s" podCreationTimestamp="2026-03-20 13:40:00 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.832433459 +0000 UTC m=+1110.342152425" lastFinishedPulling="2026-03-20 13:40:38.2646404 +0000 UTC m=+1137.774359356" observedRunningTime="2026-03-20 13:40:41.521196459 +0000 UTC m=+1141.030915445" watchObservedRunningTime="2026-03-20 13:40:41.524321086 +0000 UTC m=+1141.034040052"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.771766 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.778706 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.785667 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-wmn9g"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.785964 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.786251 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.786523 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.798184 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961038 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-cache\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961082 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961151 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzh45\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-kube-api-access-gzh45\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961195 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-lock\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:41 crc kubenswrapper[4895]: I0320 13:40:41.961379 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063370 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-cache\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063424 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063470 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzh45\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-kube-api-access-gzh45\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-lock\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.063649 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.063680 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.063744 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:42.563724312 +0000 UTC m=+1142.073443278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.063844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-cache\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.064045 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-lock\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.065623 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.065649 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d294064a5fba24ec5d2791e9e048131555e6f677c3c482a192ef8e1439a66915/globalmount\"" pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.072148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.082107 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzh45\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-kube-api-access-gzh45\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.093512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a4fb0c9-82ac-4166-abd7-a48049f3fcfe\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: I0320 13:40:42.574998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.575167 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.575189 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:40:42 crc kubenswrapper[4895]: E0320 13:40:42.575246 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:43.575230011 +0000 UTC m=+1143.084948977 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found
Mar 20 13:40:43 crc kubenswrapper[4895]: I0320 13:40:43.424562 4895 scope.go:117] "RemoveContainer" containerID="6acb94277af9cadcf887066793e5e4239f8aab422e70f38444289f147f2918bf"
Mar 20 13:40:43 crc kubenswrapper[4895]: W0320 13:40:43.521686 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb98a9bf6_0f6e_410a_8d4b_10bfea0d3b60.slice/crio-b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb WatchSource:0}: Error finding container b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb: Status 404 returned error can't find the container with id b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb
Mar 20 13:40:43 crc kubenswrapper[4895]: W0320 13:40:43.531147 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d9d49d_3f8e_4d52_9824_2f74e592d3ae.slice/crio-0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d WatchSource:0}: Error finding container 0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d: Status 404 returned error can't find the container with id 0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d
Mar 20 13:40:43 crc kubenswrapper[4895]: I0320 13:40:43.554374 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 20 13:40:43 crc kubenswrapper[4895]: I0320 13:40:43.599372 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:40:43 crc kubenswrapper[4895]: E0320 13:40:43.599548 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:40:43 crc kubenswrapper[4895]: E0320 13:40:43.599569 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:40:43 crc kubenswrapper[4895]: E0320 13:40:43.599610 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:45.59959443 +0000 UTC m=+1145.109313396 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found
Mar 20 13:40:43 crc kubenswrapper[4895]: I0320 13:40:43.865030 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8z7w6"
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.011012 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989jj\" (UniqueName: \"kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj\") pod \"3f72b56f-f910-4f15-86ec-982800c24df8\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") "
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.011341 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc\") pod \"3f72b56f-f910-4f15-86ec-982800c24df8\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") "
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.011424 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config\") pod \"3f72b56f-f910-4f15-86ec-982800c24df8\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") "
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.011578 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb\") pod \"3f72b56f-f910-4f15-86ec-982800c24df8\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") "
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.011682 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb\") pod \"3f72b56f-f910-4f15-86ec-982800c24df8\" (UID: \"3f72b56f-f910-4f15-86ec-982800c24df8\") "
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.014654 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj" (OuterVolumeSpecName: "kube-api-access-989jj") pod "3f72b56f-f910-4f15-86ec-982800c24df8" (UID: "3f72b56f-f910-4f15-86ec-982800c24df8"). InnerVolumeSpecName "kube-api-access-989jj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.061014 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f72b56f-f910-4f15-86ec-982800c24df8" (UID: "3f72b56f-f910-4f15-86ec-982800c24df8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.062114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f72b56f-f910-4f15-86ec-982800c24df8" (UID: "3f72b56f-f910-4f15-86ec-982800c24df8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.063977 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f72b56f-f910-4f15-86ec-982800c24df8" (UID: "3f72b56f-f910-4f15-86ec-982800c24df8"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.064370 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config" (OuterVolumeSpecName: "config") pod "3f72b56f-f910-4f15-86ec-982800c24df8" (UID: "3f72b56f-f910-4f15-86ec-982800c24df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.120559 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.120597 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.120609 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989jj\" (UniqueName: \"kubernetes.io/projected/3f72b56f-f910-4f15-86ec-982800c24df8-kube-api-access-989jj\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.120622 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.120631 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f72b56f-f910-4f15-86ec-982800c24df8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.128229 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"] Mar 20 13:40:44 crc 
kubenswrapper[4895]: W0320 13:40:44.211644 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7484f3c6_ea94_407e_a221_0386705a5caa.slice/crio-e9e896cc6577a4fa89f9a8b90e94f661023412d333d5543c9d452db662e7d3ab WatchSource:0}: Error finding container e9e896cc6577a4fa89f9a8b90e94f661023412d333d5543c9d452db662e7d3ab: Status 404 returned error can't find the container with id e9e896cc6577a4fa89f9a8b90e94f661023412d333d5543c9d452db662e7d3ab Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.533156 4895 generic.go:334] "Generic (PLEG): container finished" podID="e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" containerID="530ebdf7ad28d0091f593b288034bec6e31f15676adc713f1f19cc87dd623256" exitCode=0 Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.533322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0bd-account-create-update-gh5pj" event={"ID":"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae","Type":"ContainerDied","Data":"530ebdf7ad28d0091f593b288034bec6e31f15676adc713f1f19cc87dd623256"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.536291 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0bd-account-create-update-gh5pj" event={"ID":"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae","Type":"ContainerStarted","Data":"0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.538997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8z7w6" event={"ID":"3f72b56f-f910-4f15-86ec-982800c24df8","Type":"ContainerDied","Data":"c22436ca4cc7faefc4615b01c88ef8e1703819a18114e0cf0d17de2a649d166e"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.539042 4895 scope.go:117] "RemoveContainer" containerID="aea4fe2fdce6d9eac7aaf71fadfc9e5926f0902a60ac81188f68195f1037fdc0" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.539157 4895 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8z7w6" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.557163 4895 generic.go:334] "Generic (PLEG): container finished" podID="b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" containerID="c2bd7abd45a1eac6a4fed95e69e392b11e558cd4b5288c5b73472e95783da3b4" exitCode=0 Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.557407 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kwm44" event={"ID":"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60","Type":"ContainerDied","Data":"c2bd7abd45a1eac6a4fed95e69e392b11e558cd4b5288c5b73472e95783da3b4"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.557519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kwm44" event={"ID":"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60","Type":"ContainerStarted","Data":"b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.565840 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerStarted","Data":"50e0940a55bf4e54ccd717d8a358f4f6094480f24432da73f96c2e0c906a6b15"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.569894 4895 generic.go:334] "Generic (PLEG): container finished" podID="7484f3c6-ea94-407e-a221-0386705a5caa" containerID="144c81ee2b61e5cecf8e2e43c64e4a23f44336604f9cbe1761503cd39fca1fdd" exitCode=0 Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.569963 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" event={"ID":"7484f3c6-ea94-407e-a221-0386705a5caa","Type":"ContainerDied","Data":"144c81ee2b61e5cecf8e2e43c64e4a23f44336604f9cbe1761503cd39fca1fdd"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.570004 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" event={"ID":"7484f3c6-ea94-407e-a221-0386705a5caa","Type":"ContainerStarted","Data":"e9e896cc6577a4fa89f9a8b90e94f661023412d333d5543c9d452db662e7d3ab"} Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.596366 4895 scope.go:117] "RemoveContainer" containerID="f4b501ab06d91cfe73035e6bedbf91f380b3e4e674f4f09477333243de34af10" Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.640985 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"] Mar 20 13:40:44 crc kubenswrapper[4895]: I0320 13:40:44.648232 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8z7w6"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.232979 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" path="/var/lib/kubelet/pods/3f72b56f-f910-4f15-86ec-982800c24df8/volumes" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.234349 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sc5bx"] Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.234861 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="dnsmasq-dns" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.234890 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="dnsmasq-dns" Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.234934 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="init" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.234947 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="init" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.235246 4895 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3f72b56f-f910-4f15-86ec-982800c24df8" containerName="dnsmasq-dns" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.236660 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sc5bx"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.236777 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.262439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.262500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlvr\" (UniqueName: \"kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.264189 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.364213 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.364276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-btlvr\" (UniqueName: \"kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.365653 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.380512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btlvr\" (UniqueName: \"kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr\") pod \"root-account-create-update-sc5bx\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.591275 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" event={"ID":"7484f3c6-ea94-407e-a221-0386705a5caa","Type":"ContainerStarted","Data":"b9bc36f0c78e3717e6cbb07a5b2db29a25fb3162fe8a2b63a795f620f2058f5d"} Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.591768 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.606039 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.639878 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podStartSLOduration=5.639853631 podStartE2EDuration="5.639853631s" podCreationTimestamp="2026-03-20 13:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:45.625957299 +0000 UTC m=+1145.135676285" watchObservedRunningTime="2026-03-20 13:40:45.639853631 +0000 UTC m=+1145.149572607" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.671078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0" Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.671725 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.671745 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.671782 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:49.671765949 +0000 UTC m=+1149.181485005 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.696654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dsdw5"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.723587 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.726503 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.726776 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.726816 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773650 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773754 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773803 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rb57\" (UniqueName: \"kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.773837 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.787066 4895 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dsdw5"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.806545 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-dsdw5"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.812242 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2x97m"] Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.814290 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.819936 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2x97m"] Mar 20 13:40:45 crc kubenswrapper[4895]: E0320 13:40:45.838421 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-7rb57 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-dsdw5" podUID="20bd142d-1126-4830-80d8-01c7af17483f" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878110 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: 
I0320 13:40:45.878161 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878189 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878223 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878243 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4v4z\" (UniqueName: \"kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878305 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878353 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878425 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878454 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rb57\" (UniqueName: \"kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878491 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878521 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.878544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.879335 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.879591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.880070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.952527 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.952537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.952704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.955234 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rb57\" (UniqueName: \"kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57\") pod \"swift-ring-rebalance-dsdw5\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980116 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts\") pod \"swift-ring-rebalance-2x97m\" (UID: 
\"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980219 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4v4z\" (UniqueName: \"kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980358 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.980490 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " 
pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.981897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.982160 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.982470 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.984019 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.984835 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.985061 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:45 crc kubenswrapper[4895]: I0320 13:40:45.998282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4v4z\" (UniqueName: \"kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z\") pod \"swift-ring-rebalance-2x97m\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.075871 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kwm44" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.137807 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0bd-account-create-update-gh5pj" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.173812 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.182715 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmmx\" (UniqueName: \"kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx\") pod \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.182817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts\") pod \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\" (UID: \"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.182879 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts\") pod \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.182963 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgbq\" (UniqueName: \"kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq\") pod \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\" (UID: \"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.184521 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" (UID: "e7d9d49d-3f8e-4d52-9824-2f74e592d3ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.184541 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" (UID: "b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.285452 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.285487 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.331941 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx" (OuterVolumeSpecName: "kube-api-access-vmmmx") pod "e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" (UID: "e7d9d49d-3f8e-4d52-9824-2f74e592d3ae"). InnerVolumeSpecName "kube-api-access-vmmmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.332533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq" (OuterVolumeSpecName: "kube-api-access-7wgbq") pod "b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" (UID: "b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60"). InnerVolumeSpecName "kube-api-access-7wgbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.360193 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sc5bx"] Mar 20 13:40:46 crc kubenswrapper[4895]: W0320 13:40:46.361321 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ecd98b_9bb8_472c_b4a1_04119e7c31e5.slice/crio-d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4 WatchSource:0}: Error finding container d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4: Status 404 returned error can't find the container with id d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4 Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.387799 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgbq\" (UniqueName: \"kubernetes.io/projected/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60-kube-api-access-7wgbq\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.387844 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmmx\" (UniqueName: \"kubernetes.io/projected/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae-kube-api-access-vmmmx\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.599994 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sc5bx" event={"ID":"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5","Type":"ContainerStarted","Data":"d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4"} Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.601559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kwm44" event={"ID":"b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60","Type":"ContainerDied","Data":"b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb"} Mar 20 13:40:46 crc kubenswrapper[4895]: 
I0320 13:40:46.601584 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a02598312d3417583339baf5a4a99dadd61480ab7fecd0fbcafa78c09caeeb" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.601591 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kwm44" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.603659 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerStarted","Data":"a43dab9a9e09c5a0c42c56f927700548d51349304a9b7968c5a82afb4f832c21"} Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.606409 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.606428 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0bd-account-create-update-gh5pj" event={"ID":"e7d9d49d-3f8e-4d52-9824-2f74e592d3ae","Type":"ContainerDied","Data":"0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d"} Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.606467 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa9d9242d02168ae9537d9e57ba4f08ec95d92b58db06a8eea16dfab30d8d8d" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.606566 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0bd-account-create-update-gh5pj" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.616374 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693316 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693356 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rb57\" (UniqueName: \"kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693456 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693504 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693546 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693658 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts\") pod \"20bd142d-1126-4830-80d8-01c7af17483f\" (UID: \"20bd142d-1126-4830-80d8-01c7af17483f\") " Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.693997 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.694272 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts" (OuterVolumeSpecName: "scripts") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.694600 4895 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/20bd142d-1126-4830-80d8-01c7af17483f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.694622 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.695073 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.698324 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.698766 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57" (OuterVolumeSpecName: "kube-api-access-7rb57") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "kube-api-access-7rb57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.698930 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.706332 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20bd142d-1126-4830-80d8-01c7af17483f" (UID: "20bd142d-1126-4830-80d8-01c7af17483f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.758849 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2x97m"] Mar 20 13:40:46 crc kubenswrapper[4895]: W0320 13:40:46.761375 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef2bf2f7_0cd4_4c17_8e27_a5c250fe761a.slice/crio-ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f WatchSource:0}: Error finding container ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f: Status 404 returned error can't find the container with id ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.796612 4895 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.796645 4895 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7rb57\" (UniqueName: \"kubernetes.io/projected/20bd142d-1126-4830-80d8-01c7af17483f-kube-api-access-7rb57\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.796654 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.796662 4895 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/20bd142d-1126-4830-80d8-01c7af17483f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:46 crc kubenswrapper[4895]: I0320 13:40:46.796671 4895 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/20bd142d-1126-4830-80d8-01c7af17483f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.542794 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk" Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.637563 4895 generic.go:334] "Generic (PLEG): container finished" podID="e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" containerID="8967f215925a8aefb9eaf99d8d0dbb9a601aa8ea42640204b072a606c7c7403f" exitCode=0 Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.637640 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sc5bx" event={"ID":"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5","Type":"ContainerDied","Data":"8967f215925a8aefb9eaf99d8d0dbb9a601aa8ea42640204b072a606c7c7403f"} Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.639789 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dsdw5" Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.639783 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2x97m" event={"ID":"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a","Type":"ContainerStarted","Data":"ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f"} Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.711581 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-dsdw5"] Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.712699 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-ltb4d" Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.721597 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-dsdw5"] Mar 20 13:40:47 crc kubenswrapper[4895]: I0320 13:40:47.823226 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-h8n6z" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.567550 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-45kh2"] Mar 20 13:40:48 crc kubenswrapper[4895]: E0320 13:40:48.568283 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" containerName="mariadb-account-create-update" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.568307 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" containerName="mariadb-account-create-update" Mar 20 13:40:48 crc kubenswrapper[4895]: E0320 13:40:48.568324 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" containerName="mariadb-database-create" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.568333 4895 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" containerName="mariadb-database-create" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.568633 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" containerName="mariadb-database-create" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.568658 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" containerName="mariadb-account-create-update" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.569486 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.577158 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-45kh2"] Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.670997 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9e0a-account-create-update-7gcwg"] Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.672199 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.674924 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e0a-account-create-update-7gcwg"] Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.703087 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.744466 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsm9\" (UniqueName: \"kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9\") pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.745070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts\") pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.810936 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="27c73d65-3dcb-44cb-a61e-004919dda8b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.825842 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.846860 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsm9\" (UniqueName: \"kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9\") 
pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.846967 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts\") pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.847013 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.847070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjbrl\" (UniqueName: \"kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.850747 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts\") pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.877271 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsm9\" (UniqueName: 
\"kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9\") pod \"glance-db-create-45kh2\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.892417 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-45kh2" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.893342 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.950727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.950810 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjbrl\" (UniqueName: \"kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.952155 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:48 crc kubenswrapper[4895]: I0320 13:40:48.968014 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjbrl\" (UniqueName: 
\"kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl\") pod \"glance-9e0a-account-create-update-7gcwg\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.022009 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.222454 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20bd142d-1126-4830-80d8-01c7af17483f" path="/var/lib/kubelet/pods/20bd142d-1126-4830-80d8-01c7af17483f/volumes" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.311099 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7z6b6"] Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.312650 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.323046 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7z6b6"] Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.412441 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c7d5-account-create-update-tmrfb"] Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.413555 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.415805 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.420383 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c7d5-account-create-update-tmrfb"] Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.463372 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts\") pod \"keystone-db-create-7z6b6\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.463715 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75srk\" (UniqueName: \"kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk\") pod \"keystone-db-create-7z6b6\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.565406 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.565466 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75srk\" (UniqueName: \"kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk\") pod \"keystone-db-create-7z6b6\" (UID: 
\"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.565493 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br2lr\" (UniqueName: \"kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.565574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts\") pod \"keystone-db-create-7z6b6\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.566290 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts\") pod \"keystone-db-create-7z6b6\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.590158 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75srk\" (UniqueName: \"kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk\") pod \"keystone-db-create-7z6b6\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.652649 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.667131 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.667185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br2lr\" (UniqueName: \"kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.667831 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.685496 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br2lr\" (UniqueName: \"kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr\") pod \"keystone-c7d5-account-create-update-tmrfb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.737169 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:49 crc kubenswrapper[4895]: I0320 13:40:49.768665 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0" Mar 20 13:40:49 crc kubenswrapper[4895]: E0320 13:40:49.768836 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:40:49 crc kubenswrapper[4895]: E0320 13:40:49.768863 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:40:49 crc kubenswrapper[4895]: E0320 13:40:49.768927 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:57.768906581 +0000 UTC m=+1157.278625547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.414337 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.631957 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.668888 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sc5bx" event={"ID":"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5","Type":"ContainerDied","Data":"d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4"} Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.668937 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2c9f13134c4a9c8ae72bc08538c9f9124c77179e73f3174d2be12388f807fc4" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.669008 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sc5bx" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.789194 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btlvr\" (UniqueName: \"kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr\") pod \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.789325 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts\") pod \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\" (UID: \"e1ecd98b-9bb8-472c-b4a1-04119e7c31e5\") " Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.794795 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" (UID: "e1ecd98b-9bb8-472c-b4a1-04119e7c31e5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.797662 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr" (OuterVolumeSpecName: "kube-api-access-btlvr") pod "e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" (UID: "e1ecd98b-9bb8-472c-b4a1-04119e7c31e5"). InnerVolumeSpecName "kube-api-access-btlvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.891363 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btlvr\" (UniqueName: \"kubernetes.io/projected/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-kube-api-access-btlvr\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.891429 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:50 crc kubenswrapper[4895]: I0320 13:40:50.962596 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.019695 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"] Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.019885 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="dnsmasq-dns" containerID="cri-o://8875685e2299356974f7dedeb06fff52da5f56f81e2137bf2f77f3edff84b683" gracePeriod=10 Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.677427 4895 generic.go:334] "Generic (PLEG): container finished" podID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" 
containerID="8875685e2299356974f7dedeb06fff52da5f56f81e2137bf2f77f3edff84b683" exitCode=0 Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.677480 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" event={"ID":"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb","Type":"ContainerDied","Data":"8875685e2299356974f7dedeb06fff52da5f56f81e2137bf2f77f3edff84b683"} Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.716384 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sc5bx"] Mar 20 13:40:51 crc kubenswrapper[4895]: I0320 13:40:51.723957 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sc5bx"] Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.085781 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.219127 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config\") pod \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.219419 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkjgm\" (UniqueName: \"kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm\") pod \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.219588 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc\") pod \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\" (UID: \"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb\") " Mar 20 13:40:52 crc 
kubenswrapper[4895]: I0320 13:40:52.229097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm" (OuterVolumeSpecName: "kube-api-access-lkjgm") pod "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" (UID: "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb"). InnerVolumeSpecName "kube-api-access-lkjgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.251490 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-45kh2"] Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.272572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config" (OuterVolumeSpecName: "config") pod "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" (UID: "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.276017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" (UID: "e5914e04-ecb0-4e00-8b39-8fbc9abf6afb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.301056 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.301128 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.322099 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.322125 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.322138 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkjgm\" (UniqueName: \"kubernetes.io/projected/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb-kube-api-access-lkjgm\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.427879 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e0a-account-create-update-7gcwg"] Mar 20 13:40:52 crc kubenswrapper[4895]: W0320 13:40:52.429635 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd681d662_1e62_4fa1_bf4d_4e9740068509.slice/crio-249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe WatchSource:0}: Error finding container 249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe: Status 404 returned error can't find the container with id 249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.505120 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7z6b6"] Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.529069 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c7d5-account-create-update-tmrfb"] Mar 20 13:40:52 crc kubenswrapper[4895]: W0320 13:40:52.535476 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba653d5b_44a2_4eb4_9e1b_96e4a4c353cb.slice/crio-1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15 WatchSource:0}: Error finding container 1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15: Status 404 returned error can't find the container with id 1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15 Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.685194 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7z6b6" event={"ID":"cc3b9065-76cf-4c3c-b701-ae88033ddeef","Type":"ContainerStarted","Data":"320c135305b056c45bcdce4715e74f103d36bea3b13f54d03433d7d4727b57fb"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.685252 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7z6b6" event={"ID":"cc3b9065-76cf-4c3c-b701-ae88033ddeef","Type":"ContainerStarted","Data":"eb15e2b6d041b06be783f1b3860d3793a03b48ea081bd606dddb17cd122f5119"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.686162 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-c7d5-account-create-update-tmrfb" event={"ID":"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb","Type":"ContainerStarted","Data":"0116942417cb73924596a9a0fd45b80fcbaf5fcf9ff322d28a0da38dc4c71b91"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.686188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c7d5-account-create-update-tmrfb" event={"ID":"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb","Type":"ContainerStarted","Data":"1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.688742 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerStarted","Data":"a231508d8cd5ac7863d4108e6696278bfdfcb4b6c62da73a706425d75841a01a"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.690224 4895 generic.go:334] "Generic (PLEG): container finished" podID="89345a71-c6db-4cc8-9ee9-120d5cfd4426" containerID="8b21f079ce682199b9e1c0a665e2c8c19820761ad991113b5189521ab297d381" exitCode=0 Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.690282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-45kh2" event={"ID":"89345a71-c6db-4cc8-9ee9-120d5cfd4426","Type":"ContainerDied","Data":"8b21f079ce682199b9e1c0a665e2c8c19820761ad991113b5189521ab297d381"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.690467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-45kh2" event={"ID":"89345a71-c6db-4cc8-9ee9-120d5cfd4426","Type":"ContainerStarted","Data":"eb8be2cd7783101c425382845aa65fb2dab160cb4cb82f36b4e8da938eedb34e"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.691870 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e0a-account-create-update-7gcwg" 
event={"ID":"d681d662-1e62-4fa1-bf4d-4e9740068509","Type":"ContainerStarted","Data":"027d9374dd6a66e14a1b75c1f10fc190f5792d9aeae857860a9b2132aa6b5f13"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.691906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e0a-account-create-update-7gcwg" event={"ID":"d681d662-1e62-4fa1-bf4d-4e9740068509","Type":"ContainerStarted","Data":"249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.693307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2x97m" event={"ID":"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a","Type":"ContainerStarted","Data":"2d0565f8ad976c968f37b843bfd12aab1866c1b24b8079e9840c1994b4375b4e"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.694886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" event={"ID":"e5914e04-ecb0-4e00-8b39-8fbc9abf6afb","Type":"ContainerDied","Data":"9e59724f11df6c39e6b0fde7233bbf7a058f67fc67c7366508864c8f64892bd5"} Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.694918 4895 scope.go:117] "RemoveContainer" containerID="8875685e2299356974f7dedeb06fff52da5f56f81e2137bf2f77f3edff84b683" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.695116 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zhqtz" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.717560 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7z6b6" podStartSLOduration=3.717538676 podStartE2EDuration="3.717538676s" podCreationTimestamp="2026-03-20 13:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:52.709882807 +0000 UTC m=+1152.219601773" watchObservedRunningTime="2026-03-20 13:40:52.717538676 +0000 UTC m=+1152.227257642" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.743206 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.741181775 podStartE2EDuration="52.74318359s" podCreationTimestamp="2026-03-20 13:40:00 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.842448736 +0000 UTC m=+1110.352167702" lastFinishedPulling="2026-03-20 13:40:51.844450551 +0000 UTC m=+1151.354169517" observedRunningTime="2026-03-20 13:40:52.734348781 +0000 UTC m=+1152.244067747" watchObservedRunningTime="2026-03-20 13:40:52.74318359 +0000 UTC m=+1152.252902556" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.764524 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9e0a-account-create-update-7gcwg" podStartSLOduration=4.764504356 podStartE2EDuration="4.764504356s" podCreationTimestamp="2026-03-20 13:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:52.749043884 +0000 UTC m=+1152.258762840" watchObservedRunningTime="2026-03-20 13:40:52.764504356 +0000 UTC m=+1152.274223322" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.766930 4895 scope.go:117] "RemoveContainer" 
containerID="f05c099156f9e6c72a91a336e8ac885c6498762e2dba67af8a19fa1cabe34194" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.815478 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2x97m" podStartSLOduration=2.7436746 podStartE2EDuration="7.815460543s" podCreationTimestamp="2026-03-20 13:40:45 +0000 UTC" firstStartedPulling="2026-03-20 13:40:46.763340098 +0000 UTC m=+1146.273059064" lastFinishedPulling="2026-03-20 13:40:51.835126041 +0000 UTC m=+1151.344845007" observedRunningTime="2026-03-20 13:40:52.799184342 +0000 UTC m=+1152.308903308" watchObservedRunningTime="2026-03-20 13:40:52.815460543 +0000 UTC m=+1152.325179509" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.819477 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c7d5-account-create-update-tmrfb" podStartSLOduration=3.819461412 podStartE2EDuration="3.819461412s" podCreationTimestamp="2026-03-20 13:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:52.809685021 +0000 UTC m=+1152.319403977" watchObservedRunningTime="2026-03-20 13:40:52.819461412 +0000 UTC m=+1152.329180378" Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.839498 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"] Mar 20 13:40:52 crc kubenswrapper[4895]: I0320 13:40:52.846706 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zhqtz"] Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.221279 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" path="/var/lib/kubelet/pods/e1ecd98b-9bb8-472c-b4a1-04119e7c31e5/volumes" Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.221902 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" path="/var/lib/kubelet/pods/e5914e04-ecb0-4e00-8b39-8fbc9abf6afb/volumes" Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.707377 4895 generic.go:334] "Generic (PLEG): container finished" podID="cc3b9065-76cf-4c3c-b701-ae88033ddeef" containerID="320c135305b056c45bcdce4715e74f103d36bea3b13f54d03433d7d4727b57fb" exitCode=0 Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.707779 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7z6b6" event={"ID":"cc3b9065-76cf-4c3c-b701-ae88033ddeef","Type":"ContainerDied","Data":"320c135305b056c45bcdce4715e74f103d36bea3b13f54d03433d7d4727b57fb"} Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.710089 4895 generic.go:334] "Generic (PLEG): container finished" podID="d681d662-1e62-4fa1-bf4d-4e9740068509" containerID="027d9374dd6a66e14a1b75c1f10fc190f5792d9aeae857860a9b2132aa6b5f13" exitCode=0 Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.710217 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e0a-account-create-update-7gcwg" event={"ID":"d681d662-1e62-4fa1-bf4d-4e9740068509","Type":"ContainerDied","Data":"027d9374dd6a66e14a1b75c1f10fc190f5792d9aeae857860a9b2132aa6b5f13"} Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.713968 4895 generic.go:334] "Generic (PLEG): container finished" podID="ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" containerID="0116942417cb73924596a9a0fd45b80fcbaf5fcf9ff322d28a0da38dc4c71b91" exitCode=0 Mar 20 13:40:53 crc kubenswrapper[4895]: I0320 13:40:53.715007 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c7d5-account-create-update-tmrfb" event={"ID":"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb","Type":"ContainerDied","Data":"0116942417cb73924596a9a0fd45b80fcbaf5fcf9ff322d28a0da38dc4c71b91"} Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.151405 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-45kh2" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.256752 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts\") pod \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.256874 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsm9\" (UniqueName: \"kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9\") pod \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\" (UID: \"89345a71-c6db-4cc8-9ee9-120d5cfd4426\") " Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.257928 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89345a71-c6db-4cc8-9ee9-120d5cfd4426" (UID: "89345a71-c6db-4cc8-9ee9-120d5cfd4426"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.262262 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9" (OuterVolumeSpecName: "kube-api-access-hdsm9") pod "89345a71-c6db-4cc8-9ee9-120d5cfd4426" (UID: "89345a71-c6db-4cc8-9ee9-120d5cfd4426"). InnerVolumeSpecName "kube-api-access-hdsm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.359021 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsm9\" (UniqueName: \"kubernetes.io/projected/89345a71-c6db-4cc8-9ee9-120d5cfd4426-kube-api-access-hdsm9\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.359064 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89345a71-c6db-4cc8-9ee9-120d5cfd4426-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.726661 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-45kh2" event={"ID":"89345a71-c6db-4cc8-9ee9-120d5cfd4426","Type":"ContainerDied","Data":"eb8be2cd7783101c425382845aa65fb2dab160cb4cb82f36b4e8da938eedb34e"} Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.727022 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8be2cd7783101c425382845aa65fb2dab160cb4cb82f36b4e8da938eedb34e" Mar 20 13:40:54 crc kubenswrapper[4895]: I0320 13:40:54.726844 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-45kh2" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.209822 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.279459 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75srk\" (UniqueName: \"kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk\") pod \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.279515 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts\") pod \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\" (UID: \"cc3b9065-76cf-4c3c-b701-ae88033ddeef\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.280980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc3b9065-76cf-4c3c-b701-ae88033ddeef" (UID: "cc3b9065-76cf-4c3c-b701-ae88033ddeef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.285459 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk" (OuterVolumeSpecName: "kube-api-access-75srk") pod "cc3b9065-76cf-4c3c-b701-ae88033ddeef" (UID: "cc3b9065-76cf-4c3c-b701-ae88033ddeef"). InnerVolumeSpecName "kube-api-access-75srk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355353 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-84vsk"] Mar 20 13:40:55 crc kubenswrapper[4895]: E0320 13:40:55.355703 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3b9065-76cf-4c3c-b701-ae88033ddeef" containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355719 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3b9065-76cf-4c3c-b701-ae88033ddeef" containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: E0320 13:40:55.355730 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" containerName="mariadb-account-create-update" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355736 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" containerName="mariadb-account-create-update" Mar 20 13:40:55 crc kubenswrapper[4895]: E0320 13:40:55.355761 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="init" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355767 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="init" Mar 20 13:40:55 crc kubenswrapper[4895]: E0320 13:40:55.355778 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="dnsmasq-dns" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355784 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="dnsmasq-dns" Mar 20 13:40:55 crc kubenswrapper[4895]: E0320 13:40:55.355796 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89345a71-c6db-4cc8-9ee9-120d5cfd4426" 
containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355802 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="89345a71-c6db-4cc8-9ee9-120d5cfd4426" containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355954 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5914e04-ecb0-4e00-8b39-8fbc9abf6afb" containerName="dnsmasq-dns" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355965 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="89345a71-c6db-4cc8-9ee9-120d5cfd4426" containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355974 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ecd98b-9bb8-472c-b4a1-04119e7c31e5" containerName="mariadb-account-create-update" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.355984 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc3b9065-76cf-4c3c-b701-ae88033ddeef" containerName="mariadb-database-create" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.356665 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-84vsk"] Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.356765 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.359359 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.365105 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.376462 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.381279 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc3b9065-76cf-4c3c-b701-ae88033ddeef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.381302 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75srk\" (UniqueName: \"kubernetes.io/projected/cc3b9065-76cf-4c3c-b701-ae88033ddeef-kube-api-access-75srk\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482305 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts\") pod \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482460 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjbrl\" (UniqueName: \"kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl\") pod \"d681d662-1e62-4fa1-bf4d-4e9740068509\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482500 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts\") pod \"d681d662-1e62-4fa1-bf4d-4e9740068509\" (UID: \"d681d662-1e62-4fa1-bf4d-4e9740068509\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482643 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br2lr\" (UniqueName: \"kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr\") pod 
\"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\" (UID: \"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb\") " Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482798 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" (UID: "ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.482889 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6qx\" (UniqueName: \"kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx\") pod \"root-account-create-update-84vsk\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.483057 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts\") pod \"root-account-create-update-84vsk\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.483154 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.483264 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d681d662-1e62-4fa1-bf4d-4e9740068509" (UID: 
"d681d662-1e62-4fa1-bf4d-4e9740068509"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.486278 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr" (OuterVolumeSpecName: "kube-api-access-br2lr") pod "ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" (UID: "ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb"). InnerVolumeSpecName "kube-api-access-br2lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.487283 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl" (OuterVolumeSpecName: "kube-api-access-sjbrl") pod "d681d662-1e62-4fa1-bf4d-4e9740068509" (UID: "d681d662-1e62-4fa1-bf4d-4e9740068509"). InnerVolumeSpecName "kube-api-access-sjbrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.584983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts\") pod \"root-account-create-update-84vsk\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.585089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6qx\" (UniqueName: \"kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx\") pod \"root-account-create-update-84vsk\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.585195 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br2lr\" (UniqueName: \"kubernetes.io/projected/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb-kube-api-access-br2lr\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.585210 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjbrl\" (UniqueName: \"kubernetes.io/projected/d681d662-1e62-4fa1-bf4d-4e9740068509-kube-api-access-sjbrl\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.585220 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d681d662-1e62-4fa1-bf4d-4e9740068509-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.586261 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts\") pod \"root-account-create-update-84vsk\" 
(UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.602614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6qx\" (UniqueName: \"kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx\") pod \"root-account-create-update-84vsk\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.690527 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.738625 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e0a-account-create-update-7gcwg" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.738642 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e0a-account-create-update-7gcwg" event={"ID":"d681d662-1e62-4fa1-bf4d-4e9740068509","Type":"ContainerDied","Data":"249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe"} Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.738681 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249d88503f2b018ada1836b30162b74681575c9df204210165130498351507fe" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.746856 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7z6b6" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.747104 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7z6b6" event={"ID":"cc3b9065-76cf-4c3c-b701-ae88033ddeef","Type":"ContainerDied","Data":"eb15e2b6d041b06be783f1b3860d3793a03b48ea081bd606dddb17cd122f5119"} Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.747158 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb15e2b6d041b06be783f1b3860d3793a03b48ea081bd606dddb17cd122f5119" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.752137 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c7d5-account-create-update-tmrfb" event={"ID":"ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb","Type":"ContainerDied","Data":"1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15"} Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.752171 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1887d89896667869036040e4f424ce3d2d6f4035b879b325268bb353d0ea9e15" Mar 20 13:40:55 crc kubenswrapper[4895]: I0320 13:40:55.752243 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c7d5-account-create-update-tmrfb" Mar 20 13:40:56 crc kubenswrapper[4895]: I0320 13:40:56.110752 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 13:40:56 crc kubenswrapper[4895]: I0320 13:40:56.187690 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-84vsk"] Mar 20 13:40:56 crc kubenswrapper[4895]: I0320 13:40:56.775032 4895 generic.go:334] "Generic (PLEG): container finished" podID="2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" containerID="617002a65165215a129ed9952750302a16406334c1e8e358fe7833ca8dde6832" exitCode=0 Mar 20 13:40:56 crc kubenswrapper[4895]: I0320 13:40:56.775237 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84vsk" event={"ID":"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185","Type":"ContainerDied","Data":"617002a65165215a129ed9952750302a16406334c1e8e358fe7833ca8dde6832"} Mar 20 13:40:56 crc kubenswrapper[4895]: I0320 13:40:56.775475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84vsk" event={"ID":"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185","Type":"ContainerStarted","Data":"d7d75cf01e2cd3f35528af7d1824427430aa1ff92145584a152e005f2d5f01cf"} Mar 20 13:40:57 crc kubenswrapper[4895]: I0320 13:40:57.329082 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 13:40:57 crc kubenswrapper[4895]: I0320 13:40:57.839432 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0" Mar 20 13:40:57 crc kubenswrapper[4895]: E0320 13:40:57.840003 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Mar 20 13:40:57 crc kubenswrapper[4895]: E0320 13:40:57.840109 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:40:57 crc kubenswrapper[4895]: E0320 13:40:57.840204 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift podName:a1dc57ca-aca1-4886-ba82-f2f4b73944a1 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:13.840184745 +0000 UTC m=+1173.349903721 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift") pod "swift-storage-0" (UID: "a1dc57ca-aca1-4886-ba82-f2f4b73944a1") : configmap "swift-ring-files" not found Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.253624 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.347644 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6qx\" (UniqueName: \"kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx\") pod \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.347840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts\") pod \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\" (UID: \"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185\") " Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.348682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" (UID: "2af9cc4f-cbe3-44a6-9d69-9ad9304a5185"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.358593 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx" (OuterVolumeSpecName: "kube-api-access-7s6qx") pod "2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" (UID: "2af9cc4f-cbe3-44a6-9d69-9ad9304a5185"). InnerVolumeSpecName "kube-api-access-7s6qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.450289 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6qx\" (UniqueName: \"kubernetes.io/projected/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-kube-api-access-7s6qx\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.450320 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.803968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84vsk" event={"ID":"2af9cc4f-cbe3-44a6-9d69-9ad9304a5185","Type":"ContainerDied","Data":"d7d75cf01e2cd3f35528af7d1824427430aa1ff92145584a152e005f2d5f01cf"} Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.804024 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d75cf01e2cd3f35528af7d1824427430aa1ff92145584a152e005f2d5f01cf" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.804028 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-84vsk" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.840978 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="27c73d65-3dcb-44cb-a61e-004919dda8b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851212 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zwhhm"] Mar 20 13:40:58 crc kubenswrapper[4895]: E0320 13:40:58.851631 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d681d662-1e62-4fa1-bf4d-4e9740068509" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851654 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d681d662-1e62-4fa1-bf4d-4e9740068509" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: E0320 13:40:58.851687 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851697 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: E0320 13:40:58.851737 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851748 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851969 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d681d662-1e62-4fa1-bf4d-4e9740068509" 
containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.851997 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.852014 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" containerName="mariadb-account-create-update" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.852749 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.866745 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zwhhm"] Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.867749 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54xh9" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.869335 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.960283 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfw7\" (UniqueName: \"kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.960499 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 
13:40:58.960687 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:58 crc kubenswrapper[4895]: I0320 13:40:58.960845 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.063194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.063377 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.063501 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfw7\" (UniqueName: \"kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.063562 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.069852 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.070058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.079704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.084583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfw7\" (UniqueName: \"kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7\") pod \"glance-db-sync-zwhhm\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.182951 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zwhhm" Mar 20 13:40:59 crc kubenswrapper[4895]: I0320 13:40:59.250272 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4phvm" podUID="f0db633f-39ca-4915-ab69-a17d9140e31b" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:40:59 crc kubenswrapper[4895]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:40:59 crc kubenswrapper[4895]: > Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.787651 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zwhhm"] Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.817324 4895 generic.go:334] "Generic (PLEG): container finished" podID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerID="7f0ecc47a978afc25c2a7716be49f21ca24938da5b6654d45c79de22b1b4e5a1" exitCode=0 Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.817410 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerDied","Data":"7f0ecc47a978afc25c2a7716be49f21ca24938da5b6654d45c79de22b1b4e5a1"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.820791 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" containerID="2d0565f8ad976c968f37b843bfd12aab1866c1b24b8079e9840c1994b4375b4e" exitCode=0 Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.820898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2x97m" event={"ID":"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a","Type":"ContainerDied","Data":"2d0565f8ad976c968f37b843bfd12aab1866c1b24b8079e9840c1994b4375b4e"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.822682 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zwhhm" 
event={"ID":"87249cf1-602d-4f80-976a-bc7a59bd4cfd","Type":"ContainerStarted","Data":"ddc8fd2bfb2f0fe2a4b387ff8b9991fa68e500ff89cfb19726130c7579af78e1"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.824310 4895 generic.go:334] "Generic (PLEG): container finished" podID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerID="9aabfb7c063f6233a078d4c562ccf0edf17a494752a0173dd120f4d4b03ed45d" exitCode=0 Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:40:59.824341 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerDied","Data":"9aabfb7c063f6233a078d4c562ccf0edf17a494752a0173dd120f4d4b03ed45d"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.834931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerStarted","Data":"edef95c7aadc2de12b902612def468d8cf92db96635227593d1fc4c8cf48f79d"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.835186 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.837135 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerStarted","Data":"7697587e287762ce47515f74218184115e63cfb97792b8724a3bff895729b31a"} Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.837328 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.865509 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.01667626 podStartE2EDuration="1m7.865488455s" podCreationTimestamp="2026-03-20 13:39:53 +0000 UTC" firstStartedPulling="2026-03-20 
13:40:10.840851347 +0000 UTC m=+1110.350570313" lastFinishedPulling="2026-03-20 13:40:22.689663542 +0000 UTC m=+1122.199382508" observedRunningTime="2026-03-20 13:41:00.859968338 +0000 UTC m=+1160.369687334" watchObservedRunningTime="2026-03-20 13:41:00.865488455 +0000 UTC m=+1160.375207421" Mar 20 13:41:00 crc kubenswrapper[4895]: I0320 13:41:00.888686 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.722485519 podStartE2EDuration="1m6.888669686s" podCreationTimestamp="2026-03-20 13:39:54 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.041491042 +0000 UTC m=+1109.551209998" lastFinishedPulling="2026-03-20 13:40:24.207675179 +0000 UTC m=+1123.717394165" observedRunningTime="2026-03-20 13:41:00.881572221 +0000 UTC m=+1160.391291227" watchObservedRunningTime="2026-03-20 13:41:00.888669686 +0000 UTC m=+1160.398388652" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.191983 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.223481 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4v4z\" (UniqueName: \"kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.223543 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.223569 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.226476 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.251180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z" (OuterVolumeSpecName: "kube-api-access-v4v4z") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). 
InnerVolumeSpecName "kube-api-access-v4v4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.256314 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.324916 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.324996 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.325039 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.325065 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf\") pod \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\" (UID: \"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a\") " Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.325714 4895 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4v4z\" (UniqueName: \"kubernetes.io/projected/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-kube-api-access-v4v4z\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.325735 4895 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.325750 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.326061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.330879 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.353166 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.371565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts" (OuterVolumeSpecName: "scripts") pod "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" (UID: "ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.403892 4895 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podf341d72e-a04d-4f58-a7f9-bed0b19710ae"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podf341d72e-a04d-4f58-a7f9-bed0b19710ae] : Timed out while waiting for systemd to remove kubepods-besteffort-podf341d72e_a04d_4f58_a7f9_bed0b19710ae.slice" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.428119 4895 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.428153 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.428165 4895 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.428178 4895 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 
13:41:01.726419 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-84vsk"] Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.734759 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-84vsk"] Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.845251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2x97m" event={"ID":"ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a","Type":"ContainerDied","Data":"ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f"} Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.845298 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8b1c1492fca40e8e59b8053bfbbb7f0db98f020ebf2997d9f432a4f3407e4f" Mar 20 13:41:01 crc kubenswrapper[4895]: I0320 13:41:01.845338 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2x97m" Mar 20 13:41:02 crc kubenswrapper[4895]: I0320 13:41:02.329530 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:02 crc kubenswrapper[4895]: I0320 13:41:02.332035 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:02 crc kubenswrapper[4895]: I0320 13:41:02.855776 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:03 crc kubenswrapper[4895]: I0320 13:41:03.222105 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af9cc4f-cbe3-44a6-9d69-9ad9304a5185" path="/var/lib/kubelet/pods/2af9cc4f-cbe3-44a6-9d69-9ad9304a5185/volumes" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.228013 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4phvm" 
podUID="f0db633f-39ca-4915-ab69-a17d9140e31b" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:41:04 crc kubenswrapper[4895]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:41:04 crc kubenswrapper[4895]: > Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.271904 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.281329 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mvskb" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.582437 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4phvm-config-jnvgt"] Mar 20 13:41:04 crc kubenswrapper[4895]: E0320 13:41:04.582814 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" containerName="swift-ring-rebalance" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.582831 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" containerName="swift-ring-rebalance" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.582979 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a" containerName="swift-ring-rebalance" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.583600 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.585967 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.595524 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4phvm-config-jnvgt"] Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683413 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683463 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683509 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdkg\" (UniqueName: \"kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683554 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn\") pod 
\"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683596 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.683770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785280 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785351 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdkg\" (UniqueName: 
\"kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785526 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785604 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785681 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.785752 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.786187 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.787556 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.806091 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdkg\" (UniqueName: \"kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg\") pod \"ovn-controller-4phvm-config-jnvgt\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") " pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:04 crc kubenswrapper[4895]: I0320 13:41:04.902669 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.470895 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4phvm-config-jnvgt"] Mar 20 13:41:05 crc kubenswrapper[4895]: W0320 13:41:05.480956 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe065488_57d5_49e0_9f46_73acfdd62e4f.slice/crio-b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f WatchSource:0}: Error finding container b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f: Status 404 returned error can't find the container with id b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.879214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4phvm-config-jnvgt" event={"ID":"be065488-57d5-49e0-9f46-73acfdd62e4f","Type":"ContainerStarted","Data":"fae5152c45783fec6a8d79b16fb0f0570c62f212ef5dc69fac74017e2edd87bc"} Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.879479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4phvm-config-jnvgt" event={"ID":"be065488-57d5-49e0-9f46-73acfdd62e4f","Type":"ContainerStarted","Data":"b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f"} Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.901984 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.902214 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus" containerID="cri-o://50e0940a55bf4e54ccd717d8a358f4f6094480f24432da73f96c2e0c906a6b15" gracePeriod=600 Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 
13:41:05.902247 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="thanos-sidecar" containerID="cri-o://a231508d8cd5ac7863d4108e6696278bfdfcb4b6c62da73a706425d75841a01a" gracePeriod=600 Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.902559 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="config-reloader" containerID="cri-o://a43dab9a9e09c5a0c42c56f927700548d51349304a9b7968c5a82afb4f832c21" gracePeriod=600 Mar 20 13:41:05 crc kubenswrapper[4895]: I0320 13:41:05.910980 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4phvm-config-jnvgt" podStartSLOduration=1.910966818 podStartE2EDuration="1.910966818s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:05.909620175 +0000 UTC m=+1165.419339141" watchObservedRunningTime="2026-03-20 13:41:05.910966818 +0000 UTC m=+1165.420685784" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.741971 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mgd6t"] Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.743369 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.755613 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mgd6t"] Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.756716 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.821133 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l5z\" (UniqueName: \"kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.821217 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.893304 4895 generic.go:334] "Generic (PLEG): container finished" podID="be065488-57d5-49e0-9f46-73acfdd62e4f" containerID="fae5152c45783fec6a8d79b16fb0f0570c62f212ef5dc69fac74017e2edd87bc" exitCode=0 Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.893873 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4phvm-config-jnvgt" event={"ID":"be065488-57d5-49e0-9f46-73acfdd62e4f","Type":"ContainerDied","Data":"fae5152c45783fec6a8d79b16fb0f0570c62f212ef5dc69fac74017e2edd87bc"} Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900181 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerID="a231508d8cd5ac7863d4108e6696278bfdfcb4b6c62da73a706425d75841a01a" exitCode=0 Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900214 4895 generic.go:334] "Generic (PLEG): container finished" podID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerID="a43dab9a9e09c5a0c42c56f927700548d51349304a9b7968c5a82afb4f832c21" exitCode=0 Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900224 4895 generic.go:334] "Generic (PLEG): container finished" podID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerID="50e0940a55bf4e54ccd717d8a358f4f6094480f24432da73f96c2e0c906a6b15" exitCode=0 Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900253 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerDied","Data":"a231508d8cd5ac7863d4108e6696278bfdfcb4b6c62da73a706425d75841a01a"} Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900650 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerDied","Data":"a43dab9a9e09c5a0c42c56f927700548d51349304a9b7968c5a82afb4f832c21"} Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.900729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerDied","Data":"50e0940a55bf4e54ccd717d8a358f4f6094480f24432da73f96c2e0c906a6b15"} Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.922892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 
crc kubenswrapper[4895]: I0320 13:41:06.923085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l5z\" (UniqueName: \"kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.924254 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:06 crc kubenswrapper[4895]: I0320 13:41:06.943031 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l5z\" (UniqueName: \"kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z\") pod \"root-account-create-update-mgd6t\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:07 crc kubenswrapper[4895]: I0320 13:41:07.071533 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:08 crc kubenswrapper[4895]: I0320 13:41:08.812522 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="27c73d65-3dcb-44cb-a61e-004919dda8b4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:41:09 crc kubenswrapper[4895]: I0320 13:41:09.224109 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4phvm" Mar 20 13:41:10 crc kubenswrapper[4895]: I0320 13:41:10.328701 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.388044 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4phvm-config-jnvgt" Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.389077 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.448855 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449360 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449501 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") " Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449532 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.449617 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp8qq\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450712 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450768 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450822 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450872 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450905 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450937 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.450964 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.451001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khdkg\" (UniqueName: \"kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg\") pod \"be065488-57d5-49e0-9f46-73acfdd62e4f\" (UID: \"be065488-57d5-49e0-9f46-73acfdd62e4f\") "
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.458455 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg" (OuterVolumeSpecName: "kube-api-access-khdkg") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "kube-api-access-khdkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.459079 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.460758 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts" (OuterVolumeSpecName: "scripts") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.462674 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.462733 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run" (OuterVolumeSpecName: "var-run") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.463254 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.463718 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.465148 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq" (OuterVolumeSpecName: "kube-api-access-fp8qq") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "kube-api-access-fp8qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.465427 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.469800 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out" (OuterVolumeSpecName: "config-out") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.471382 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config" (OuterVolumeSpecName: "config") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.471459 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "be065488-57d5-49e0-9f46-73acfdd62e4f" (UID: "be065488-57d5-49e0-9f46-73acfdd62e4f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: E0320 13:41:13.474015 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc podName:e484f448-cd78-4a38-bb24-6f3e82fc81ea nodeName:}" failed. No retries permitted until 2026-03-20 13:41:13.973994132 +0000 UTC m=+1173.483713098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "prometheus-metric-storage-db" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.477335 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.480346 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.490904 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config" (OuterVolumeSpecName: "web-config") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556061 4895 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-web-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556090 4895 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-tls-assets\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556099 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556109 4895 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556120 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556129 4895 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556140 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp8qq\" (UniqueName: \"kubernetes.io/projected/e484f448-cd78-4a38-bb24-6f3e82fc81ea-kube-api-access-fp8qq\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556154 4895 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e484f448-cd78-4a38-bb24-6f3e82fc81ea-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556165 4895 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be065488-57d5-49e0-9f46-73acfdd62e4f-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556317 4895 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e484f448-cd78-4a38-bb24-6f3e82fc81ea-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556328 4895 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556337 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-run\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556345 4895 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be065488-57d5-49e0-9f46-73acfdd62e4f-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556354 4895 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e484f448-cd78-4a38-bb24-6f3e82fc81ea-config-out\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.556362 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khdkg\" (UniqueName: \"kubernetes.io/projected/be065488-57d5-49e0-9f46-73acfdd62e4f-kube-api-access-khdkg\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:13 crc kubenswrapper[4895]: W0320 13:41:13.684565 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod646f3cd6_24a0_4418_83ea_7a1f6e9b3654.slice/crio-497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c WatchSource:0}: Error finding container 497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c: Status 404 returned error can't find the container with id 497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.689160 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mgd6t"]
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.866481 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.874042 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a1dc57ca-aca1-4886-ba82-f2f4b73944a1-etc-swift\") pod \"swift-storage-0\" (UID: \"a1dc57ca-aca1-4886-ba82-f2f4b73944a1\") " pod="openstack/swift-storage-0"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.926188 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.967936 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mgd6t" event={"ID":"646f3cd6-24a0-4418-83ea-7a1f6e9b3654","Type":"ContainerStarted","Data":"b9a01c73df8552b87832259fd22d38089d241fa9a6f5bb3feea4bb455f727f0c"}
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.968001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mgd6t" event={"ID":"646f3cd6-24a0-4418-83ea-7a1f6e9b3654","Type":"ContainerStarted","Data":"497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c"}
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.970241 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zwhhm" event={"ID":"87249cf1-602d-4f80-976a-bc7a59bd4cfd","Type":"ContainerStarted","Data":"aabdf34cff7072c7ff33eff4f7c984dbe99a55804c109d098cb8b9a1fda59a3b"}
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.972010 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4phvm-config-jnvgt" event={"ID":"be065488-57d5-49e0-9f46-73acfdd62e4f","Type":"ContainerDied","Data":"b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f"}
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.972039 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f669e5338ef4d48aa6a595c04b5c58681b6ee98ed27f975deaadf5a6bfb49f"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.972161 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4phvm-config-jnvgt"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.975652 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e484f448-cd78-4a38-bb24-6f3e82fc81ea","Type":"ContainerDied","Data":"c78eda001f0e711a8e0d3121a22d10d8539ca73e0c16b1edd860fc76c4d5b120"}
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.975683 4895 scope.go:117] "RemoveContainer" containerID="a231508d8cd5ac7863d4108e6696278bfdfcb4b6c62da73a706425d75841a01a"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.975767 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:13 crc kubenswrapper[4895]: I0320 13:41:13.993821 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-mgd6t" podStartSLOduration=7.993787394 podStartE2EDuration="7.993787394s" podCreationTimestamp="2026-03-20 13:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:13.992715049 +0000 UTC m=+1173.502434015" watchObservedRunningTime="2026-03-20 13:41:13.993787394 +0000 UTC m=+1173.503506360"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.015256 4895 scope.go:117] "RemoveContainer" containerID="a43dab9a9e09c5a0c42c56f927700548d51349304a9b7968c5a82afb4f832c21"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.020785 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zwhhm" podStartSLOduration=2.530305078 podStartE2EDuration="16.020658008s" podCreationTimestamp="2026-03-20 13:40:58 +0000 UTC" firstStartedPulling="2026-03-20 13:40:59.794981625 +0000 UTC m=+1159.304700611" lastFinishedPulling="2026-03-20 13:41:13.285334565 +0000 UTC m=+1172.795053541" observedRunningTime="2026-03-20 13:41:14.018693804 +0000 UTC m=+1173.528412780" watchObservedRunningTime="2026-03-20 13:41:14.020658008 +0000 UTC m=+1173.530376974"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.050132 4895 scope.go:117] "RemoveContainer" containerID="50e0940a55bf4e54ccd717d8a358f4f6094480f24432da73f96c2e0c906a6b15"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.070162 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\" (UID: \"e484f448-cd78-4a38-bb24-6f3e82fc81ea\") "
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.087210 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e484f448-cd78-4a38-bb24-6f3e82fc81ea" (UID: "e484f448-cd78-4a38-bb24-6f3e82fc81ea"). InnerVolumeSpecName "pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.114050 4895 scope.go:117] "RemoveContainer" containerID="1f2c8768a0a5dc360d597d5566e03e8d30ef9148aaf6f8c528cbaa28af5faa51"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.171707 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") on node \"crc\" "
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.190715 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.190845 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc") on node "crc"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.239974 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.248783 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266462 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:41:14 crc kubenswrapper[4895]: E0320 13:41:14.266812 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="init-config-reloader"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266826 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="init-config-reloader"
Mar 20 13:41:14 crc kubenswrapper[4895]: E0320 13:41:14.266851 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be065488-57d5-49e0-9f46-73acfdd62e4f" containerName="ovn-config"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266858 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="be065488-57d5-49e0-9f46-73acfdd62e4f" containerName="ovn-config"
Mar 20 13:41:14 crc kubenswrapper[4895]: E0320 13:41:14.266868 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="thanos-sidecar"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266874 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="thanos-sidecar"
Mar 20 13:41:14 crc kubenswrapper[4895]: E0320 13:41:14.266882 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266888 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus"
Mar 20 13:41:14 crc kubenswrapper[4895]: E0320 13:41:14.266902 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="config-reloader"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.266907 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="config-reloader"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.267060 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="config-reloader"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.267081 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="be065488-57d5-49e0-9f46-73acfdd62e4f" containerName="ovn-config"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.267092 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.267100 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="thanos-sidecar"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.268637 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.273411 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") on node \"crc\" DevicePath \"\""
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276162 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-phbvs"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276350 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276612 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276646 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276821 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.276956 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.277055 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.277242 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.280093 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.319448 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.378466 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.378956 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379051 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb64\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-kube-api-access-pzb64\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379082 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379120 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379154 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379212 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be90380e-db54-4216-8972-507d8c538e4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379242 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379303 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379338 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.379365 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.472972 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481501 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481553 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481578 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481626 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be90380e-db54-4216-8972-507d8c538e4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481648 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481675 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481718 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481784 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481803 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481826 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.481862 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb64\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-kube-api-access-pzb64\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.483330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:41:14 crc kubenswrapper[4895]:
I0320 13:41:14.486800 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.489375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.491201 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.491475 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/be90380e-db54-4216-8972-507d8c538e4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.492471 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.494017 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be90380e-db54-4216-8972-507d8c538e4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.495943 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.496034 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.496072 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ed9a86cf4c4d51ee7a5816741a7d45729f9ae3892a1ed9810e6048d991f055dd/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.496213 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.500607 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.506967 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4phvm-config-jnvgt"] Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.508293 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb64\" (UniqueName: \"kubernetes.io/projected/be90380e-db54-4216-8972-507d8c538e4b-kube-api-access-pzb64\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc 
kubenswrapper[4895]: I0320 13:41:14.508374 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/be90380e-db54-4216-8972-507d8c538e4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.515771 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4phvm-config-jnvgt"] Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.538465 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8c5f48-121e-41ad-ad30-2c541cec3afc\") pod \"prometheus-metric-storage-0\" (UID: \"be90380e-db54-4216-8972-507d8c538e4b\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.588662 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.996609 4895 generic.go:334] "Generic (PLEG): container finished" podID="646f3cd6-24a0-4418-83ea-7a1f6e9b3654" containerID="b9a01c73df8552b87832259fd22d38089d241fa9a6f5bb3feea4bb455f727f0c" exitCode=0 Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.996670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mgd6t" event={"ID":"646f3cd6-24a0-4418-83ea-7a1f6e9b3654","Type":"ContainerDied","Data":"b9a01c73df8552b87832259fd22d38089d241fa9a6f5bb3feea4bb455f727f0c"} Mar 20 13:41:14 crc kubenswrapper[4895]: I0320 13:41:14.999121 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"56315ec4a8a6a66a6f9113520a04e041f09bbda18c4ea0063180264234f7b2d2"} Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.034521 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:41:15 crc kubenswrapper[4895]: W0320 13:41:15.035575 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe90380e_db54_4216_8972_507d8c538e4b.slice/crio-58b2c1c95b60a921ea4b736f62d0bf4ff91493bb71276a4595734a72b42fe84f WatchSource:0}: Error finding container 58b2c1c95b60a921ea4b736f62d0bf4ff91493bb71276a4595734a72b42fe84f: Status 404 returned error can't find the container with id 58b2c1c95b60a921ea4b736f62d0bf4ff91493bb71276a4595734a72b42fe84f Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.196573 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.241945 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be065488-57d5-49e0-9f46-73acfdd62e4f" 
path="/var/lib/kubelet/pods/be065488-57d5-49e0-9f46-73acfdd62e4f/volumes" Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.243987 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" path="/var/lib/kubelet/pods/e484f448-cd78-4a38-bb24-6f3e82fc81ea/volumes" Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.329124 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e484f448-cd78-4a38-bb24-6f3e82fc81ea" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:41:15 crc kubenswrapper[4895]: I0320 13:41:15.501606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.007542 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerStarted","Data":"58b2c1c95b60a921ea4b736f62d0bf4ff91493bb71276a4595734a72b42fe84f"} Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.405209 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.519811 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4l5z\" (UniqueName: \"kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z\") pod \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.520105 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts\") pod \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\" (UID: \"646f3cd6-24a0-4418-83ea-7a1f6e9b3654\") " Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.520775 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "646f3cd6-24a0-4418-83ea-7a1f6e9b3654" (UID: "646f3cd6-24a0-4418-83ea-7a1f6e9b3654"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.525373 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z" (OuterVolumeSpecName: "kube-api-access-h4l5z") pod "646f3cd6-24a0-4418-83ea-7a1f6e9b3654" (UID: "646f3cd6-24a0-4418-83ea-7a1f6e9b3654"). InnerVolumeSpecName "kube-api-access-h4l5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.622520 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:16 crc kubenswrapper[4895]: I0320 13:41:16.622562 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4l5z\" (UniqueName: \"kubernetes.io/projected/646f3cd6-24a0-4418-83ea-7a1f6e9b3654-kube-api-access-h4l5z\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.016730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mgd6t" event={"ID":"646f3cd6-24a0-4418-83ea-7a1f6e9b3654","Type":"ContainerDied","Data":"497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c"} Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.017125 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497839163fc6e460b14036ca8dfcf68f3553df2c26401dcf0d6b17a66c9ae05c" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.016954 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mgd6t" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.174241 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qvfkh"] Mar 20 13:41:17 crc kubenswrapper[4895]: E0320 13:41:17.174652 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646f3cd6-24a0-4418-83ea-7a1f6e9b3654" containerName="mariadb-account-create-update" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.174669 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="646f3cd6-24a0-4418-83ea-7a1f6e9b3654" containerName="mariadb-account-create-update" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.174846 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="646f3cd6-24a0-4418-83ea-7a1f6e9b3654" containerName="mariadb-account-create-update" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.175426 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.192182 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qvfkh"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.235518 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpjbx\" (UniqueName: \"kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.235601 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " 
pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.338480 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpjbx\" (UniqueName: \"kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.338575 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.339276 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.380803 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpjbx\" (UniqueName: \"kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx\") pod \"cinder-db-create-qvfkh\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") " pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.402441 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0bb8-account-create-update-cjtrn"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.403692 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.420580 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0bb8-account-create-update-cjtrn"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.423550 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.495112 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.544126 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2nbv\" (UniqueName: \"kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.544489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.584260 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7cjkn"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.589222 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.599368 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7cjkn"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.647891 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m96v\" (UniqueName: \"kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.648018 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2nbv\" (UniqueName: \"kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.648041 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.648105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.649176 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.695343 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2nbv\" (UniqueName: \"kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv\") pod \"cinder-0bb8-account-create-update-cjtrn\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.699535 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-trwxf"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.715921 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trwxf"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.716237 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trwxf" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.718858 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6158-account-create-update-vthkt"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.721712 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6158-account-create-update-vthkt" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.724241 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.732528 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6wxs8"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.733734 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6wxs8" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.736485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.738716 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.739024 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.739223 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cfzps" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.739335 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.750117 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m96v\" (UniqueName: \"kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.750429 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.751244 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.797999 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6158-account-create-update-vthkt"] Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.828132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m96v\" (UniqueName: \"kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v\") pod \"barbican-db-create-7cjkn\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") " pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867034 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8" Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867235 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt" 
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867266 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867302 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkfx\" (UniqueName: \"kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867344 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867591 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mpc\" (UniqueName: \"kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.867622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnss\" (UniqueName: \"kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.903450 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6wxs8"]
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.949141 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7cjkn"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.951440 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-whgpp"]
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.952754 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.964535 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-whgpp"]
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.969930 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.969965 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.969988 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkfx\" (UniqueName: \"kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.970008 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.970095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mpc\" (UniqueName: \"kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.970113 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnss\" (UniqueName: \"kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.970163 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.973880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:17 crc kubenswrapper[4895]: I0320 13:41:17.989189 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.007775 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mpc\" (UniqueName: \"kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc\") pod \"barbican-6158-account-create-update-vthkt\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") " pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.014151 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.014901 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkfx\" (UniqueName: \"kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.018354 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data\") pod \"keystone-db-sync-6wxs8\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.034379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnss\" (UniqueName: \"kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss\") pod \"neutron-db-create-trwxf\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") " pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.050063 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e04e-account-create-update-pnxcr"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.051202 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.052055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerStarted","Data":"df3f83bf5801d9ec2e47fd4160a413833425b05d7943ff42d215f96a67fc8b87"}
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.054454 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.066174 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e04e-account-create-update-pnxcr"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.072858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.072916 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"833c590e26bfc7bece72fa5dc7b3b2f2bcc87745ae3a7fbccceb544713ad6488"}
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.072941 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scsv4\" (UniqueName: \"kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.108542 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-d2e6-account-create-update-glvld"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.109764 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.112892 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.119311 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-d2e6-account-create-update-glvld"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.131831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.182941 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.183622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.183772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pmf\" (UniqueName: \"kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.183945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.184111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzz5h\" (UniqueName: \"kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.184172 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.184244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scsv4\" (UniqueName: \"kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.196263 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.223855 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scsv4\" (UniqueName: \"kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4\") pod \"cloudkitty-db-create-whgpp\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") " pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.225218 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qvfkh"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.228124 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6wxs8"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.286569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzz5h\" (UniqueName: \"kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.286694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.286765 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pmf\" (UniqueName: \"kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.286848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.287879 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.288268 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.303319 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzz5h\" (UniqueName: \"kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h\") pod \"neutron-e04e-account-create-update-pnxcr\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.304207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pmf\" (UniqueName: \"kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf\") pod \"cloudkitty-d2e6-account-create-update-glvld\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.312021 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.377475 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e04e-account-create-update-pnxcr"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.441172 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.529215 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0bb8-account-create-update-cjtrn"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.632657 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7cjkn"]
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.814448 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0"
Mar 20 13:41:18 crc kubenswrapper[4895]: I0320 13:41:18.910792 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-trwxf"]
Mar 20 13:41:18 crc kubenswrapper[4895]: W0320 13:41:18.913146 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4424f026_2489_4ade_bfc8_f2b711fede5d.slice/crio-c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323 WatchSource:0}: Error finding container c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323: Status 404 returned error can't find the container with id c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.057152 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6158-account-create-update-vthkt"]
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.090528 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6wxs8"]
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.098431 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7cjkn" event={"ID":"4f11268f-7299-492d-acbd-f04313e097d2","Type":"ContainerStarted","Data":"20db85ca9b6f0ff1600d2376f8a9a196c01090dd0e9f8d0c67570a1a802c7c64"}
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.101362 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0bb8-account-create-update-cjtrn" event={"ID":"0ef2dd7e-834a-4bdb-8947-bca7d65185da","Type":"ContainerStarted","Data":"0750fae7a235b1b88d958e72cc5df277cc06639d1cab4521103d1abf885dbf2f"}
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.104033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"fbd071ff715be63d9553ce47ce492649dd348a4f356d6ec0b71afb1790267fa9"}
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.108770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trwxf" event={"ID":"4424f026-2489-4ade-bfc8-f2b711fede5d","Type":"ContainerStarted","Data":"c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323"}
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.111941 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qvfkh" event={"ID":"df9c7cee-575a-4903-8c29-c977a78ac5f6","Type":"ContainerStarted","Data":"b9890558650730f9ab74e896250efa4fed0ec00860583a537f33f19467fcbe3d"}
Mar 20 13:41:19 crc kubenswrapper[4895]: W0320 13:41:19.126551 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01eb0348_778d_4efb_a1fc_32c5f653526f.slice/crio-6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436 WatchSource:0}: Error finding container 6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436: Status 404 returned error can't find the container with id 6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.149484 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-whgpp"]
Mar 20 13:41:19 crc kubenswrapper[4895]: W0320 13:41:19.277995 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode457c566_019a_4ce1_96ca_d1e4d1f8ff36.slice/crio-aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02 WatchSource:0}: Error finding container aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02: Status 404 returned error can't find the container with id aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.283699 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-d2e6-account-create-update-glvld"]
Mar 20 13:41:19 crc kubenswrapper[4895]: I0320 13:41:19.283785 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e04e-account-create-update-pnxcr"]
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.127804 4895 generic.go:334] "Generic (PLEG): container finished" podID="01eb0348-778d-4efb-a1fc-32c5f653526f" containerID="6de5c55fb64f6debf441bd70ddd5ac62086452e2cc05af35a00ad82c2a11f20d" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.128054 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6158-account-create-update-vthkt" event={"ID":"01eb0348-778d-4efb-a1fc-32c5f653526f","Type":"ContainerDied","Data":"6de5c55fb64f6debf441bd70ddd5ac62086452e2cc05af35a00ad82c2a11f20d"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.128076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6158-account-create-update-vthkt" event={"ID":"01eb0348-778d-4efb-a1fc-32c5f653526f","Type":"ContainerStarted","Data":"6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.129813 4895 generic.go:334] "Generic (PLEG): container finished" podID="4f11268f-7299-492d-acbd-f04313e097d2" containerID="f54979f6b4cc777dc85e235bc931356de05f27358c94c8a8281c5a55676230e3" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.129872 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7cjkn" event={"ID":"4f11268f-7299-492d-acbd-f04313e097d2","Type":"ContainerDied","Data":"f54979f6b4cc777dc85e235bc931356de05f27358c94c8a8281c5a55676230e3"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.133621 4895 generic.go:334] "Generic (PLEG): container finished" podID="0ef2dd7e-834a-4bdb-8947-bca7d65185da" containerID="fe3846b2dc52b38399eacc3966352e405eb9814b9236f9f4089fa04ff0aaa3d4" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.133674 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0bb8-account-create-update-cjtrn" event={"ID":"0ef2dd7e-834a-4bdb-8947-bca7d65185da","Type":"ContainerDied","Data":"fe3846b2dc52b38399eacc3966352e405eb9814b9236f9f4089fa04ff0aaa3d4"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.135024 4895 generic.go:334] "Generic (PLEG): container finished" podID="3384f4a4-4c8a-4921-b17f-95f0568d32bc" containerID="f6cebf94a0f0945bf9181f0c02d805951ad9efe05aa2378d2b5b61f2ef3aa5f0" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.135065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-d2e6-account-create-update-glvld" event={"ID":"3384f4a4-4c8a-4921-b17f-95f0568d32bc","Type":"ContainerDied","Data":"f6cebf94a0f0945bf9181f0c02d805951ad9efe05aa2378d2b5b61f2ef3aa5f0"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.135086 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-d2e6-account-create-update-glvld" event={"ID":"3384f4a4-4c8a-4921-b17f-95f0568d32bc","Type":"ContainerStarted","Data":"60feb5c2974516e4bd13418e26823c2aeb0a579557e273b70e26d3b719b3f941"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.137514 4895 generic.go:334] "Generic (PLEG): container finished" podID="e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" containerID="ec545a7e6ec7e33e26fa18a787f99b14bb3d0f9b3d70aba16d0465ef64bd0895" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.137616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-whgpp" event={"ID":"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91","Type":"ContainerDied","Data":"ec545a7e6ec7e33e26fa18a787f99b14bb3d0f9b3d70aba16d0465ef64bd0895"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.137660 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-whgpp" event={"ID":"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91","Type":"ContainerStarted","Data":"a9ed7ee8544655042fcfd3eea0614c05905135d5ad5f5870b4c1a8b543aa8124"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.141624 4895 generic.go:334] "Generic (PLEG): container finished" podID="df9c7cee-575a-4903-8c29-c977a78ac5f6" containerID="38473f1938c42206a2af8d79bc350b6b7ffc283182278b6ef355a33d964ec4e4" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.141692 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qvfkh" event={"ID":"df9c7cee-575a-4903-8c29-c977a78ac5f6","Type":"ContainerDied","Data":"38473f1938c42206a2af8d79bc350b6b7ffc283182278b6ef355a33d964ec4e4"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.150120 4895 generic.go:334] "Generic (PLEG): container finished" podID="e457c566-019a-4ce1-96ca-d1e4d1f8ff36" containerID="2c6140c254c0e05435a4bc945323b53d2dd9a23780c4d58e3e879b969303f3a9" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.150185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e04e-account-create-update-pnxcr" event={"ID":"e457c566-019a-4ce1-96ca-d1e4d1f8ff36","Type":"ContainerDied","Data":"2c6140c254c0e05435a4bc945323b53d2dd9a23780c4d58e3e879b969303f3a9"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.150211 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e04e-account-create-update-pnxcr" event={"ID":"e457c566-019a-4ce1-96ca-d1e4d1f8ff36","Type":"ContainerStarted","Data":"aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.155870 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6wxs8" event={"ID":"958c9f6e-e716-42e2-bb6a-9d44847f4525","Type":"ContainerStarted","Data":"c0137200ea32044ca58ebb2d0b35d96095ffbb68b6f36141262aa5dfdbb8e2b4"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.164178 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"fe1fe223e53d33844377df6062c23a230d3af4631a55541cd0a6d6f0801333cf"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.164233 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"d4a4179f449e22ca4a0f6e857539334d781750989c7fb76c46db9bfbce7fad46"}
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.166636 4895 generic.go:334] "Generic (PLEG): container finished" podID="4424f026-2489-4ade-bfc8-f2b711fede5d" containerID="388b86e44b30393ecd25d8d99a467d833034f82e67c58ee3e5dc1934c217926e" exitCode=0
Mar 20 13:41:20 crc kubenswrapper[4895]: I0320 13:41:20.166692 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trwxf" event={"ID":"4424f026-2489-4ade-bfc8-f2b711fede5d","Type":"ContainerDied","Data":"388b86e44b30393ecd25d8d99a467d833034f82e67c58ee3e5dc1934c217926e"}
Mar 20 13:41:21 crc kubenswrapper[4895]: I0320 13:41:21.183423 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"bc89ec84d267710082f4ee79651fce2df61c09cb604f6ff81924012422f0e4aa"}
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.196862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"200e812c137c8a1233440c3697da0d193fd7712ec43fa9c99ae0ce9794079655"}
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.298102 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.298186 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.360038 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trwxf"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.371959 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7cjkn"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.374553 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qvfkh"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385195 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnss\" (UniqueName: \"kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss\") pod \"4424f026-2489-4ade-bfc8-f2b711fede5d\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385234 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts\") pod \"4424f026-2489-4ade-bfc8-f2b711fede5d\" (UID: \"4424f026-2489-4ade-bfc8-f2b711fede5d\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385490 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts\") pod \"4f11268f-7299-492d-acbd-f04313e097d2\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385516 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m96v\" (UniqueName: \"kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v\") pod \"4f11268f-7299-492d-acbd-f04313e097d2\" (UID: \"4f11268f-7299-492d-acbd-f04313e097d2\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385533 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts\") pod \"df9c7cee-575a-4903-8c29-c977a78ac5f6\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.385564 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpjbx\" (UniqueName: \"kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx\") pod \"df9c7cee-575a-4903-8c29-c977a78ac5f6\" (UID: \"df9c7cee-575a-4903-8c29-c977a78ac5f6\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.387169 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6158-account-create-update-vthkt"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.387792 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f11268f-7299-492d-acbd-f04313e097d2" (UID: "4f11268f-7299-492d-acbd-f04313e097d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.389547 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4424f026-2489-4ade-bfc8-f2b711fede5d" (UID: "4424f026-2489-4ade-bfc8-f2b711fede5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.389724 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df9c7cee-575a-4903-8c29-c977a78ac5f6" (UID: "df9c7cee-575a-4903-8c29-c977a78ac5f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.393806 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx" (OuterVolumeSpecName: "kube-api-access-rpjbx") pod "df9c7cee-575a-4903-8c29-c977a78ac5f6" (UID: "df9c7cee-575a-4903-8c29-c977a78ac5f6"). InnerVolumeSpecName "kube-api-access-rpjbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.393912 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss" (OuterVolumeSpecName: "kube-api-access-8tnss") pod "4424f026-2489-4ade-bfc8-f2b711fede5d" (UID: "4424f026-2489-4ade-bfc8-f2b711fede5d"). InnerVolumeSpecName "kube-api-access-8tnss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.393969 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v" (OuterVolumeSpecName: "kube-api-access-5m96v") pod "4f11268f-7299-492d-acbd-f04313e097d2" (UID: "4f11268f-7299-492d-acbd-f04313e097d2"). InnerVolumeSpecName "kube-api-access-5m96v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.405222 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0bb8-account-create-update-cjtrn"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.477964 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-whgpp"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.484832 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-d2e6-account-create-update-glvld"
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487110 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts\") pod \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487202 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts\") pod \"01eb0348-778d-4efb-a1fc-32c5f653526f\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487246 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts\") pod \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487297 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scsv4\" (UniqueName: \"kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4\") pod \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\" (UID: \"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487376 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7mpc\" (UniqueName: \"kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc\") pod \"01eb0348-778d-4efb-a1fc-32c5f653526f\" (UID: \"01eb0348-778d-4efb-a1fc-32c5f653526f\") "
Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487501 4895 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t2nbv\" (UniqueName: \"kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv\") pod \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\" (UID: \"0ef2dd7e-834a-4bdb-8947-bca7d65185da\") " Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487783 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01eb0348-778d-4efb-a1fc-32c5f653526f" (UID: "01eb0348-778d-4efb-a1fc-32c5f653526f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.487839 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" (UID: "e67c072b-bfa6-4ddc-b8c6-ee96f07efc91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.488746 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ef2dd7e-834a-4bdb-8947-bca7d65185da" (UID: "0ef2dd7e-834a-4bdb-8947-bca7d65185da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.488833 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f11268f-7299-492d-acbd-f04313e097d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489040 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m96v\" (UniqueName: \"kubernetes.io/projected/4f11268f-7299-492d-acbd-f04313e097d2-kube-api-access-5m96v\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489114 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df9c7cee-575a-4903-8c29-c977a78ac5f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489170 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01eb0348-778d-4efb-a1fc-32c5f653526f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489220 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpjbx\" (UniqueName: \"kubernetes.io/projected/df9c7cee-575a-4903-8c29-c977a78ac5f6-kube-api-access-rpjbx\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489269 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489324 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnss\" (UniqueName: \"kubernetes.io/projected/4424f026-2489-4ade-bfc8-f2b711fede5d-kube-api-access-8tnss\") on node \"crc\" DevicePath \"\"" Mar 20 
13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.489381 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4424f026-2489-4ade-bfc8-f2b711fede5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.491739 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc" (OuterVolumeSpecName: "kube-api-access-q7mpc") pod "01eb0348-778d-4efb-a1fc-32c5f653526f" (UID: "01eb0348-778d-4efb-a1fc-32c5f653526f"). InnerVolumeSpecName "kube-api-access-q7mpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.492074 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e04e-account-create-update-pnxcr" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.492205 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4" (OuterVolumeSpecName: "kube-api-access-scsv4") pod "e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" (UID: "e67c072b-bfa6-4ddc-b8c6-ee96f07efc91"). InnerVolumeSpecName "kube-api-access-scsv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.494566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv" (OuterVolumeSpecName: "kube-api-access-t2nbv") pod "0ef2dd7e-834a-4bdb-8947-bca7d65185da" (UID: "0ef2dd7e-834a-4bdb-8947-bca7d65185da"). InnerVolumeSpecName "kube-api-access-t2nbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.590563 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts\") pod \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.590671 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pmf\" (UniqueName: \"kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf\") pod \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\" (UID: \"3384f4a4-4c8a-4921-b17f-95f0568d32bc\") " Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.590692 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts\") pod \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.590859 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzz5h\" (UniqueName: \"kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h\") pod \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\" (UID: \"e457c566-019a-4ce1-96ca-d1e4d1f8ff36\") " Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.591222 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7mpc\" (UniqueName: \"kubernetes.io/projected/01eb0348-778d-4efb-a1fc-32c5f653526f-kube-api-access-q7mpc\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.591239 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2nbv\" (UniqueName: 
\"kubernetes.io/projected/0ef2dd7e-834a-4bdb-8947-bca7d65185da-kube-api-access-t2nbv\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.591249 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ef2dd7e-834a-4bdb-8947-bca7d65185da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.591283 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scsv4\" (UniqueName: \"kubernetes.io/projected/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91-kube-api-access-scsv4\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.592373 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e457c566-019a-4ce1-96ca-d1e4d1f8ff36" (UID: "e457c566-019a-4ce1-96ca-d1e4d1f8ff36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.592747 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3384f4a4-4c8a-4921-b17f-95f0568d32bc" (UID: "3384f4a4-4c8a-4921-b17f-95f0568d32bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.593987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf" (OuterVolumeSpecName: "kube-api-access-q8pmf") pod "3384f4a4-4c8a-4921-b17f-95f0568d32bc" (UID: "3384f4a4-4c8a-4921-b17f-95f0568d32bc"). InnerVolumeSpecName "kube-api-access-q8pmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.594705 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h" (OuterVolumeSpecName: "kube-api-access-fzz5h") pod "e457c566-019a-4ce1-96ca-d1e4d1f8ff36" (UID: "e457c566-019a-4ce1-96ca-d1e4d1f8ff36"). InnerVolumeSpecName "kube-api-access-fzz5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.693044 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzz5h\" (UniqueName: \"kubernetes.io/projected/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-kube-api-access-fzz5h\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.693099 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3384f4a4-4c8a-4921-b17f-95f0568d32bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.693119 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pmf\" (UniqueName: \"kubernetes.io/projected/3384f4a4-4c8a-4921-b17f-95f0568d32bc-kube-api-access-q8pmf\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:22 crc kubenswrapper[4895]: I0320 13:41:22.693137 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e457c566-019a-4ce1-96ca-d1e4d1f8ff36-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.205685 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7cjkn" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.205683 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7cjkn" event={"ID":"4f11268f-7299-492d-acbd-f04313e097d2","Type":"ContainerDied","Data":"20db85ca9b6f0ff1600d2376f8a9a196c01090dd0e9f8d0c67570a1a802c7c64"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.205819 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20db85ca9b6f0ff1600d2376f8a9a196c01090dd0e9f8d0c67570a1a802c7c64" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.209002 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0bb8-account-create-update-cjtrn" event={"ID":"0ef2dd7e-834a-4bdb-8947-bca7d65185da","Type":"ContainerDied","Data":"0750fae7a235b1b88d958e72cc5df277cc06639d1cab4521103d1abf885dbf2f"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.209026 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0750fae7a235b1b88d958e72cc5df277cc06639d1cab4521103d1abf885dbf2f" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.209184 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0bb8-account-create-update-cjtrn" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.211249 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e04e-account-create-update-pnxcr" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.212642 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-d2e6-account-create-update-glvld" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.216577 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-whgpp" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.226288 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-trwxf" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.228092 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qvfkh" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.230358 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6158-account-create-update-vthkt" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232724 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e04e-account-create-update-pnxcr" event={"ID":"e457c566-019a-4ce1-96ca-d1e4d1f8ff36","Type":"ContainerDied","Data":"aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232757 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1d991f08c7bb249e212e9382894771b6367da19c353e75914bcf3f7725ac02" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232768 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-d2e6-account-create-update-glvld" event={"ID":"3384f4a4-4c8a-4921-b17f-95f0568d32bc","Type":"ContainerDied","Data":"60feb5c2974516e4bd13418e26823c2aeb0a579557e273b70e26d3b719b3f941"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232777 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60feb5c2974516e4bd13418e26823c2aeb0a579557e273b70e26d3b719b3f941" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"01e0ed01a619be010230d03700628da443cd72134ecc71f8409272c302468dc0"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232795 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-whgpp" event={"ID":"e67c072b-bfa6-4ddc-b8c6-ee96f07efc91","Type":"ContainerDied","Data":"a9ed7ee8544655042fcfd3eea0614c05905135d5ad5f5870b4c1a8b543aa8124"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232804 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ed7ee8544655042fcfd3eea0614c05905135d5ad5f5870b4c1a8b543aa8124" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232812 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-trwxf" event={"ID":"4424f026-2489-4ade-bfc8-f2b711fede5d","Type":"ContainerDied","Data":"c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232820 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0025d3611d87539d4b288b63e038331c9271250698816079d23bcf18c62c323" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qvfkh" event={"ID":"df9c7cee-575a-4903-8c29-c977a78ac5f6","Type":"ContainerDied","Data":"b9890558650730f9ab74e896250efa4fed0ec00860583a537f33f19467fcbe3d"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232839 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9890558650730f9ab74e896250efa4fed0ec00860583a537f33f19467fcbe3d" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6158-account-create-update-vthkt" 
event={"ID":"01eb0348-778d-4efb-a1fc-32c5f653526f","Type":"ContainerDied","Data":"6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436"} Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.232857 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6032149f4c62919899c39e4ce5a710004e4bc5b9739ad2b4ec10bc9301df4436" Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.234930 4895 generic.go:334] "Generic (PLEG): container finished" podID="be90380e-db54-4216-8972-507d8c538e4b" containerID="df3f83bf5801d9ec2e47fd4160a413833425b05d7943ff42d215f96a67fc8b87" exitCode=0 Mar 20 13:41:23 crc kubenswrapper[4895]: I0320 13:41:23.234971 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerDied","Data":"df3f83bf5801d9ec2e47fd4160a413833425b05d7943ff42d215f96a67fc8b87"} Mar 20 13:41:25 crc kubenswrapper[4895]: I0320 13:41:25.257361 4895 generic.go:334] "Generic (PLEG): container finished" podID="87249cf1-602d-4f80-976a-bc7a59bd4cfd" containerID="aabdf34cff7072c7ff33eff4f7c984dbe99a55804c109d098cb8b9a1fda59a3b" exitCode=0 Mar 20 13:41:25 crc kubenswrapper[4895]: I0320 13:41:25.257471 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zwhhm" event={"ID":"87249cf1-602d-4f80-976a-bc7a59bd4cfd","Type":"ContainerDied","Data":"aabdf34cff7072c7ff33eff4f7c984dbe99a55804c109d098cb8b9a1fda59a3b"} Mar 20 13:41:26 crc kubenswrapper[4895]: I0320 13:41:26.272439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6wxs8" event={"ID":"958c9f6e-e716-42e2-bb6a-9d44847f4525","Type":"ContainerStarted","Data":"9572e9d3da4b29d9fa2c08e844f4c2a9b44d437094ca2de97740ea956603724a"} Mar 20 13:41:26 crc kubenswrapper[4895]: I0320 13:41:26.275843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"39ccb1d0cc0a5c1dfa12a51290691f76bf1737fecbaf635272233e1a09152bee"} Mar 20 13:41:26 crc kubenswrapper[4895]: I0320 13:41:26.279564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerStarted","Data":"cde4091704fbad401549083114e99cff10e5a2e1ee22d8133a7147570ae23989"} Mar 20 13:41:26 crc kubenswrapper[4895]: I0320 13:41:26.297853 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6wxs8" podStartSLOduration=3.086776417 podStartE2EDuration="9.297833862s" podCreationTimestamp="2026-03-20 13:41:17 +0000 UTC" firstStartedPulling="2026-03-20 13:41:19.131664505 +0000 UTC m=+1178.641383471" lastFinishedPulling="2026-03-20 13:41:25.34272195 +0000 UTC m=+1184.852440916" observedRunningTime="2026-03-20 13:41:26.297416183 +0000 UTC m=+1185.807135159" watchObservedRunningTime="2026-03-20 13:41:26.297833862 +0000 UTC m=+1185.807552838" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.303531 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zwhhm" event={"ID":"87249cf1-602d-4f80-976a-bc7a59bd4cfd","Type":"ContainerDied","Data":"ddc8fd2bfb2f0fe2a4b387ff8b9991fa68e500ff89cfb19726130c7579af78e1"} Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.305413 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc8fd2bfb2f0fe2a4b387ff8b9991fa68e500ff89cfb19726130c7579af78e1" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.621137 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zwhhm" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.796010 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data\") pod \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.796364 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data\") pod \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.796544 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle\") pod \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.797066 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgfw7\" (UniqueName: \"kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7\") pod \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\" (UID: \"87249cf1-602d-4f80-976a-bc7a59bd4cfd\") " Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.801067 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7" (OuterVolumeSpecName: "kube-api-access-mgfw7") pod "87249cf1-602d-4f80-976a-bc7a59bd4cfd" (UID: "87249cf1-602d-4f80-976a-bc7a59bd4cfd"). InnerVolumeSpecName "kube-api-access-mgfw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.831166 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87249cf1-602d-4f80-976a-bc7a59bd4cfd" (UID: "87249cf1-602d-4f80-976a-bc7a59bd4cfd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.878193 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data" (OuterVolumeSpecName: "config-data") pod "87249cf1-602d-4f80-976a-bc7a59bd4cfd" (UID: "87249cf1-602d-4f80-976a-bc7a59bd4cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.900447 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.900511 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgfw7\" (UniqueName: \"kubernetes.io/projected/87249cf1-602d-4f80-976a-bc7a59bd4cfd-kube-api-access-mgfw7\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.900541 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:27 crc kubenswrapper[4895]: I0320 13:41:27.955452 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "87249cf1-602d-4f80-976a-bc7a59bd4cfd" (UID: "87249cf1-602d-4f80-976a-bc7a59bd4cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:28 crc kubenswrapper[4895]: I0320 13:41:28.002455 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87249cf1-602d-4f80-976a-bc7a59bd4cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:28 crc kubenswrapper[4895]: I0320 13:41:28.319483 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zwhhm" Mar 20 13:41:28 crc kubenswrapper[4895]: I0320 13:41:28.319475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"7661a4d6b3aac7cbe04e2831daeefc9fb84ba3b019a57713cf1bea45c9887a7f"} Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.100877 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101538 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e457c566-019a-4ce1-96ca-d1e4d1f8ff36" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101574 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e457c566-019a-4ce1-96ca-d1e4d1f8ff36" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101589 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101596 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" containerName="mariadb-database-create" Mar 20 13:41:29 crc 
kubenswrapper[4895]: E0320 13:41:29.101611 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01eb0348-778d-4efb-a1fc-32c5f653526f" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101618 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="01eb0348-778d-4efb-a1fc-32c5f653526f" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101651 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef2dd7e-834a-4bdb-8947-bca7d65185da" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101657 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef2dd7e-834a-4bdb-8947-bca7d65185da" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101671 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9c7cee-575a-4903-8c29-c977a78ac5f6" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101677 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9c7cee-575a-4903-8c29-c977a78ac5f6" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101687 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f11268f-7299-492d-acbd-f04313e097d2" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101693 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f11268f-7299-492d-acbd-f04313e097d2" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101730 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87249cf1-602d-4f80-976a-bc7a59bd4cfd" containerName="glance-db-sync" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101737 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87249cf1-602d-4f80-976a-bc7a59bd4cfd" containerName="glance-db-sync" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101746 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3384f4a4-4c8a-4921-b17f-95f0568d32bc" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101753 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3384f4a4-4c8a-4921-b17f-95f0568d32bc" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: E0320 13:41:29.101764 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4424f026-2489-4ade-bfc8-f2b711fede5d" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101770 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4424f026-2489-4ade-bfc8-f2b711fede5d" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.101991 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f11268f-7299-492d-acbd-f04313e097d2" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.102012 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e457c566-019a-4ce1-96ca-d1e4d1f8ff36" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.106897 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9c7cee-575a-4903-8c29-c977a78ac5f6" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.106945 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef2dd7e-834a-4bdb-8947-bca7d65185da" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.106958 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" containerName="mariadb-database-create" Mar 20 13:41:29 
crc kubenswrapper[4895]: I0320 13:41:29.106971 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4424f026-2489-4ade-bfc8-f2b711fede5d" containerName="mariadb-database-create" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.106982 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3384f4a4-4c8a-4921-b17f-95f0568d32bc" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.106990 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="01eb0348-778d-4efb-a1fc-32c5f653526f" containerName="mariadb-account-create-update" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.107028 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="87249cf1-602d-4f80-976a-bc7a59bd4cfd" containerName="glance-db-sync" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.113456 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.138935 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.225740 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.225814 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4kjs\" (UniqueName: \"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 
crc kubenswrapper[4895]: I0320 13:41:29.225855 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.225897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.225973 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.329333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.329447 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.329518 
4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.329563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4kjs\" (UniqueName: \"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.329589 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.331052 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.331572 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.332241 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.333266 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.355210 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"006fbfe9acb2dc2d24af1efb86a6704c70a8f56650083ad573ef3b544a3f9626"} Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.355280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"84e9b8d0ee71fff4cc07fa40e175a7415a51a99d585f98e88fadfe002d5d03be"} Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.359570 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4kjs\" (UniqueName: \"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs\") pod \"dnsmasq-dns-74dc88fc-wf2w2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.360201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerStarted","Data":"0fb8a79f67aed6b4435bbea40c0c15977d2a4b7df366e872ee94f0a65499dbe7"} Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.360241 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"be90380e-db54-4216-8972-507d8c538e4b","Type":"ContainerStarted","Data":"9b3b1f0acd0f388559194f3d5cd4f4d713b0f0525e4bfeccb5838d2eae679418"} Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.395830 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.395807411 podStartE2EDuration="15.395807411s" podCreationTimestamp="2026-03-20 13:41:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:29.388972815 +0000 UTC m=+1188.898691771" watchObservedRunningTime="2026-03-20 13:41:29.395807411 +0000 UTC m=+1188.905526377" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.434337 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.593485 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.594574 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.600523 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:29 crc kubenswrapper[4895]: I0320 13:41:29.963421 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:29 crc kubenswrapper[4895]: W0320 13:41:29.965203 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb025d246_c7fb_4e5b_b2ba_3ecc468f5eb2.slice/crio-d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d WatchSource:0}: Error 
finding container d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d: Status 404 returned error can't find the container with id d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.375422 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"365d9de07b39288a6bbd4ff8773a195d3f7dfd319c041995ed18e5f282af8b61"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.375473 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"014fabf9b431e0ceb10eb82d58c0040cac04d9fc7fab995c394fbcb60775ccd2"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.375485 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"eb0315104bca4a4d202bfee1990fe1ff6bf51aa5c05152263a21d2d8cdc1ffee"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.375494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a1dc57ca-aca1-4886-ba82-f2f4b73944a1","Type":"ContainerStarted","Data":"5a1900e46d7113ed10dade8359be97403e368cf6d65f500e090d8dcf7ca1aa0b"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.376997 4895 generic.go:334] "Generic (PLEG): container finished" podID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerID="4d74b47199da609d5e4af7b760019a23f11e58eaa856c1d0a755ac299b0573f9" exitCode=0 Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.377057 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" event={"ID":"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2","Type":"ContainerDied","Data":"4d74b47199da609d5e4af7b760019a23f11e58eaa856c1d0a755ac299b0573f9"} Mar 20 13:41:30 
crc kubenswrapper[4895]: I0320 13:41:30.377074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" event={"ID":"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2","Type":"ContainerStarted","Data":"d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.380983 4895 generic.go:334] "Generic (PLEG): container finished" podID="958c9f6e-e716-42e2-bb6a-9d44847f4525" containerID="9572e9d3da4b29d9fa2c08e844f4c2a9b44d437094ca2de97740ea956603724a" exitCode=0 Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.381073 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6wxs8" event={"ID":"958c9f6e-e716-42e2-bb6a-9d44847f4525","Type":"ContainerDied","Data":"9572e9d3da4b29d9fa2c08e844f4c2a9b44d437094ca2de97740ea956603724a"} Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.386362 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.412329 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.551270906 podStartE2EDuration="50.412311008s" podCreationTimestamp="2026-03-20 13:40:40 +0000 UTC" firstStartedPulling="2026-03-20 13:41:14.488226106 +0000 UTC m=+1173.997945072" lastFinishedPulling="2026-03-20 13:41:27.349266208 +0000 UTC m=+1186.858985174" observedRunningTime="2026-03-20 13:41:30.403634349 +0000 UTC m=+1189.913353335" watchObservedRunningTime="2026-03-20 13:41:30.412311008 +0000 UTC m=+1189.922029964" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.820235 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.842222 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:30 crc 
kubenswrapper[4895]: I0320 13:41:30.843689 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.853137 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.859581 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.956509 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg5qr\" (UniqueName: \"kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.956594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.956858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.956908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.957022 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:30 crc kubenswrapper[4895]: I0320 13:41:30.957175 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059368 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059484 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059511 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" 
(UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059555 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059610 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.059655 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg5qr\" (UniqueName: \"kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.060553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.060743 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.061071 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.061523 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.061710 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.083145 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg5qr\" (UniqueName: \"kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr\") pod \"dnsmasq-dns-5f59b8f679-6ccpr\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.158742 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.397999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" event={"ID":"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2","Type":"ContainerStarted","Data":"412c2ec8e53ae785769748da243924f19737ec4340add704672428c36cf39b8b"} Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.457966 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" podStartSLOduration=2.457668685 podStartE2EDuration="2.457668685s" podCreationTimestamp="2026-03-20 13:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:31.427895534 +0000 UTC m=+1190.937614500" watchObservedRunningTime="2026-03-20 13:41:31.457668685 +0000 UTC m=+1190.967387651" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.722558 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:31 crc kubenswrapper[4895]: W0320 13:41:31.722983 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5738fec7_4f5b_40e9_82f6_d02d48d7a955.slice/crio-fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d WatchSource:0}: Error finding container fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d: Status 404 returned error can't find the container with id fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.887946 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6wxs8" Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.975970 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data\") pod \"958c9f6e-e716-42e2-bb6a-9d44847f4525\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.976030 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle\") pod \"958c9f6e-e716-42e2-bb6a-9d44847f4525\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.976083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmkfx\" (UniqueName: \"kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx\") pod \"958c9f6e-e716-42e2-bb6a-9d44847f4525\" (UID: \"958c9f6e-e716-42e2-bb6a-9d44847f4525\") " Mar 20 13:41:31 crc kubenswrapper[4895]: I0320 13:41:31.979974 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx" (OuterVolumeSpecName: "kube-api-access-mmkfx") pod "958c9f6e-e716-42e2-bb6a-9d44847f4525" (UID: "958c9f6e-e716-42e2-bb6a-9d44847f4525"). InnerVolumeSpecName "kube-api-access-mmkfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.025649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data" (OuterVolumeSpecName: "config-data") pod "958c9f6e-e716-42e2-bb6a-9d44847f4525" (UID: "958c9f6e-e716-42e2-bb6a-9d44847f4525"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.031501 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "958c9f6e-e716-42e2-bb6a-9d44847f4525" (UID: "958c9f6e-e716-42e2-bb6a-9d44847f4525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.078691 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.078727 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c9f6e-e716-42e2-bb6a-9d44847f4525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.078738 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmkfx\" (UniqueName: \"kubernetes.io/projected/958c9f6e-e716-42e2-bb6a-9d44847f4525-kube-api-access-mmkfx\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.409470 4895 generic.go:334] "Generic (PLEG): container finished" podID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerID="5752368e1ffd0d3bd077828c3cb696bacd1311baf8f07e8b1be78a8a1c4bf9e8" exitCode=0 Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.409558 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" event={"ID":"5738fec7-4f5b-40e9-82f6-d02d48d7a955","Type":"ContainerDied","Data":"5752368e1ffd0d3bd077828c3cb696bacd1311baf8f07e8b1be78a8a1c4bf9e8"} Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.410127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" event={"ID":"5738fec7-4f5b-40e9-82f6-d02d48d7a955","Type":"ContainerStarted","Data":"fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d"} Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.412062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6wxs8" event={"ID":"958c9f6e-e716-42e2-bb6a-9d44847f4525","Type":"ContainerDied","Data":"c0137200ea32044ca58ebb2d0b35d96095ffbb68b6f36141262aa5dfdbb8e2b4"} Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.412133 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0137200ea32044ca58ebb2d0b35d96095ffbb68b6f36141262aa5dfdbb8e2b4" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.412628 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.412630 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6wxs8" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.413129 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="dnsmasq-dns" containerID="cri-o://412c2ec8e53ae785769748da243924f19737ec4340add704672428c36cf39b8b" gracePeriod=10 Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.712021 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dlpcl"] Mar 20 13:41:32 crc kubenswrapper[4895]: E0320 13:41:32.712418 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958c9f6e-e716-42e2-bb6a-9d44847f4525" containerName="keystone-db-sync" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.712433 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="958c9f6e-e716-42e2-bb6a-9d44847f4525" containerName="keystone-db-sync" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.712629 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="958c9f6e-e716-42e2-bb6a-9d44847f4525" containerName="keystone-db-sync" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.713300 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.721904 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.722083 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cfzps" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.722276 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.722489 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.722632 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.741662 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dlpcl"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.763228 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.791902 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.792204 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc 
kubenswrapper[4895]: I0320 13:41:32.792261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.792282 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.792363 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.792379 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfhs\" (UniqueName: \"kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.825197 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.827248 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.849994 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895581 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895616 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895692 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895753 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895799 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbcd\" (UniqueName: \"kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895870 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895933 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfhs\" (UniqueName: \"kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.895964 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.908174 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.905378 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-57bfc"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.908945 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.909915 4895 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.910108 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.912884 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.918546 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6kpmz" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.918597 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.919156 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.924854 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.934164 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-57bfc"] Mar 20 13:41:32 crc kubenswrapper[4895]: I0320 13:41:32.938606 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfhs\" (UniqueName: 
\"kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs\") pod \"keystone-bootstrap-dlpcl\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.017905 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.017956 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbcd\" (UniqueName: \"kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018018 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018053 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018087 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7gs\" (UniqueName: 
\"kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018180 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018203 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018230 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.018278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.033885 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.045095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.045342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: 
\"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.046603 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.096368 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.096991 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.106133 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbcd\" (UniqueName: \"kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd\") pod \"dnsmasq-dns-bbf5cc879-9crs6\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.128670 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt7gs\" (UniqueName: \"kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.130955 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.131424 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.131524 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.131682 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.131855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.133211 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " 
pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.163243 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.163754 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.164296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.181473 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.184033 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt7gs\" (UniqueName: \"kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs\") pod \"cinder-db-sync-57bfc\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.189591 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.192238 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9nkgt"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.195551 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.233784 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ngd5w" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.233945 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.234060 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.291277 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9nkgt"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.295463 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.297690 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.323452 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.323656 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.336030 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-clr45"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.337258 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.337948 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-57bfc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.354157 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.354499 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rpmz6" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.355972 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8kk\" (UniqueName: \"kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356004 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvnb9\" (UniqueName: \"kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356028 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tghnc\" (UniqueName: \"kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356129 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356199 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356220 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356236 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config\") pod \"neutron-db-sync-9nkgt\" (UID: 
\"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356256 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356279 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356301 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.356318 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.357437 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.399517 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-clr45"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 
13:41:33.455193 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-lh78p"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463330 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463420 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463485 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463506 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463523 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463542 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463565 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8kk\" (UniqueName: \"kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463666 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvnb9\" (UniqueName: 
\"kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463688 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tghnc\" (UniqueName: \"kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.463705 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.464188 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.464992 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.475231 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.485820 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pbltf" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.486022 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.489083 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.492447 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.494095 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.499979 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.505009 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.510636 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerID="412c2ec8e53ae785769748da243924f19737ec4340add704672428c36cf39b8b" exitCode=0 Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.510784 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-lh78p"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.510815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" event={"ID":"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2","Type":"ContainerDied","Data":"412c2ec8e53ae785769748da243924f19737ec4340add704672428c36cf39b8b"} Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.510838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" event={"ID":"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2","Type":"ContainerDied","Data":"d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d"} Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.510851 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e6eea4a5960a54cd932353318e4ef73e5c9538071792871aae6a975f9bf97d" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.512030 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.513124 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.513821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.514855 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" event={"ID":"5738fec7-4f5b-40e9-82f6-d02d48d7a955","Type":"ContainerStarted","Data":"ef58461765e9f8b579090cb930f74f94825b7ec2fbeba6897bc7f3647b750142"} Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.514971 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.514997 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="dnsmasq-dns" containerID="cri-o://ef58461765e9f8b579090cb930f74f94825b7ec2fbeba6897bc7f3647b750142" gracePeriod=10 Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.515033 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.515578 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvnb9\" (UniqueName: \"kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9\") pod \"barbican-db-sync-clr45\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.526051 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.530076 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tghnc\" (UniqueName: \"kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc\") pod \"ceilometer-0\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.538287 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8kk\" (UniqueName: \"kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk\") pod \"neutron-db-sync-9nkgt\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.619355 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.652191 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.669690 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.669771 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.671063 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.671113 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2bl\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.671234 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " 
pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.672420 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-m79dc"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.678259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-clr45" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.680876 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.694494 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.694667 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.698061 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z768l" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.712939 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.779133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.779187 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.779266 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.779324 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.779356 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz2bl\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 
13:41:33.791057 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.806606 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.808664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.812286 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2bl\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.838874 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle\") pod \"cloudkitty-db-sync-lh78p\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.875589 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m79dc"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 
13:41:33.880365 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4kjs\" (UniqueName: \"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs\") pod \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.880571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb\") pod \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.880696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config\") pod \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.880721 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc\") pod \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.880799 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb\") pod \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\" (UID: \"b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2\") " Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.881097 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts\") pod \"placement-db-sync-m79dc\" 
(UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.881130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.881170 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.881190 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzf7\" (UniqueName: \"kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.881233 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.895928 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.909607 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs" (OuterVolumeSpecName: "kube-api-access-s4kjs") pod "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" (UID: "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2"). InnerVolumeSpecName "kube-api-access-s4kjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.960309 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:41:33 crc kubenswrapper[4895]: E0320 13:41:33.960819 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="dnsmasq-dns" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.960833 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="dnsmasq-dns" Mar 20 13:41:33 crc kubenswrapper[4895]: E0320 13:41:33.960863 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="init" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.960869 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="init" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.960916 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config" (OuterVolumeSpecName: "config") pod "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" (UID: "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.961082 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" containerName="dnsmasq-dns" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.962183 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" (UID: "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.962361 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.970002 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.970378 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" podStartSLOduration=3.970360694 podStartE2EDuration="3.970360694s" podCreationTimestamp="2026-03-20 13:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:33.642468252 +0000 UTC m=+1193.152187218" watchObservedRunningTime="2026-03-20 13:41:33.970360694 +0000 UTC m=+1193.480079660" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.982980 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 
13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983207 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983229 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzf7\" (UniqueName: \"kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983277 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983288 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4kjs\" (UniqueName: 
\"kubernetes.io/projected/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-kube-api-access-s4kjs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983299 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.983695 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.986036 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.988163 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.990809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" (UID: "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.992899 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" (UID: "b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:33 crc kubenswrapper[4895]: I0320 13:41:33.994232 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.000171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzf7\" (UniqueName: \"kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7\") pod \"placement-db-sync-m79dc\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.047432 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.049112 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.052680 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.052872 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-54xh9" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.053009 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.053133 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.056691 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.076245 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085175 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085450 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwsx\" (UniqueName: \"kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085546 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085577 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085616 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" 
(UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.085761 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.088652 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.088715 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.089654 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m79dc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.090145 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.095743 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.096024 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.104100 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192543 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192576 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192670 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192735 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192920 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192951 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.192981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193015 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193086 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwsx\" (UniqueName: \"kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193109 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.193188 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.194408 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.194457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.194465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.194572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvqh\" (UniqueName: \"kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.194604 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcwd\" (UniqueName: \"kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: 
I0320 13:41:34.195503 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.197190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.197416 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.198086 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.214401 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwsx\" (UniqueName: \"kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx\") pod \"dnsmasq-dns-56df8fb6b7-5shnc\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") " pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.283436 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296600 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296634 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296654 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296746 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296791 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296843 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296872 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvqh\" (UniqueName: \"kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296906 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcwd\" (UniqueName: \"kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.296947 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.297017 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.297049 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.297085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 
20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.297113 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.298455 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.300220 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.303630 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.311782 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.317014 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.317057 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/18ebb825098e65852293f2b0f63099f5113b6726c6c2675c80c59a63de5999b9/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.322810 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.322862 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b6443358e93f73fc16f0714d0b2e759c539b0d4c74a7e83912ba5d1f8dded95/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.324826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.334876 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcwd\" (UniqueName: 
\"kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.336626 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.336907 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.337488 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dlpcl"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.347647 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.348099 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.353609 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.355364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.355819 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.356089 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xvqh\" (UniqueName: \"kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: W0320 13:41:34.366749 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef48bf1d_6c18_420b_a56f_c59e06e5a2ee.slice/crio-63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7 WatchSource:0}: Error finding container 63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7: Status 404 returned error can't find the container with id 63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7 Mar 20 
13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.404716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.415438 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:34 crc kubenswrapper[4895]: W0320 13:41:34.424431 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8931b7d_39d5_4912_9c99_7c4005368c7c.slice/crio-d5404d77204ba6f1e9a3d5a5bfcb7b081ca5cf4f32aea4618633ca97f24d7bb2 WatchSource:0}: Error finding container d5404d77204ba6f1e9a3d5a5bfcb7b081ca5cf4f32aea4618633ca97f24d7bb2: Status 404 returned error can't find the container with id d5404d77204ba6f1e9a3d5a5bfcb7b081ca5cf4f32aea4618633ca97f24d7bb2 Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.438442 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.445032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.577123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" event={"ID":"d8931b7d-39d5-4912-9c99-7c4005368c7c","Type":"ContainerStarted","Data":"d5404d77204ba6f1e9a3d5a5bfcb7b081ca5cf4f32aea4618633ca97f24d7bb2"} Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.580792 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dlpcl" event={"ID":"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee","Type":"ContainerStarted","Data":"63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7"} Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.587939 4895 generic.go:334] "Generic (PLEG): container finished" podID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerID="ef58461765e9f8b579090cb930f74f94825b7ec2fbeba6897bc7f3647b750142" exitCode=0 Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.588240 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-wf2w2" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.588129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" event={"ID":"5738fec7-4f5b-40e9-82f6-d02d48d7a955","Type":"ContainerDied","Data":"ef58461765e9f8b579090cb930f74f94825b7ec2fbeba6897bc7f3647b750142"} Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.604536 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" event={"ID":"5738fec7-4f5b-40e9-82f6-d02d48d7a955","Type":"ContainerDied","Data":"fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d"} Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.604589 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc45c3883181be9e408aa329f9709dc42800cfda46f5783185b8483724da168d" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.617237 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.668966 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709523 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709616 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg5qr\" (UniqueName: \"kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709781 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709806 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709825 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.709851 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb\") pod \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\" (UID: \"5738fec7-4f5b-40e9-82f6-d02d48d7a955\") " Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.743033 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9nkgt"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.787697 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-57bfc"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.788200 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr" (OuterVolumeSpecName: "kube-api-access-zg5qr") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "kube-api-access-zg5qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.811766 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg5qr\" (UniqueName: \"kubernetes.io/projected/5738fec7-4f5b-40e9-82f6-d02d48d7a955-kube-api-access-zg5qr\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.827323 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.845042 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.858157 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-wf2w2"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.880679 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-clr45"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.887867 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-db-sync-lh78p"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.896859 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m79dc"] Mar 20 13:41:34 crc kubenswrapper[4895]: I0320 13:41:34.959580 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config" (OuterVolumeSpecName: "config") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.002030 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.002153 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.006119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.009842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5738fec7-4f5b-40e9-82f6-d02d48d7a955" (UID: "5738fec7-4f5b-40e9-82f6-d02d48d7a955"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.018348 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.018383 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.018405 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.018414 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.018422 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5738fec7-4f5b-40e9-82f6-d02d48d7a955-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.093575 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:41:35 crc 
kubenswrapper[4895]: I0320 13:41:35.256096 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2" path="/var/lib/kubelet/pods/b025d246-c7fb-4e5b-b2ba-3ecc468f5eb2/volumes" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.304945 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.408727 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.681238 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerStarted","Data":"b5978181f99adf91dce4b66eca0467fecd30179c379929ee6f35e977684ed44c"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.701871 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-clr45" event={"ID":"c56a9ed0-b52b-42a4-a875-5d383303c91e","Type":"ContainerStarted","Data":"30056c479494251b1287ebb6860e96581c501b758ed149f1960a208fcffa7cda"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.708566 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57bfc" event={"ID":"19fedca4-15c2-4975-807e-e0c9ded7f329","Type":"ContainerStarted","Data":"b324d30244f6903b16e7e36822d27e18242390948e880e84738bde73e3467d99"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.723728 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" event={"ID":"4a1141e6-4801-4733-891c-3e9607c36aca","Type":"ContainerStarted","Data":"0ffadad289dead97ffdf0eb1b5d4918f748a4b17333cb0cf13d7c3822cdd5f94"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.783672 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9nkgt" 
event={"ID":"2c43c76c-2573-4cce-880d-830e4fd8bed9","Type":"ContainerStarted","Data":"fa35306d58e049177176882caa1719c8f32a63e8debd7da7dec0e87b3b63d162"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.787958 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9nkgt" event={"ID":"2c43c76c-2573-4cce-880d-830e4fd8bed9","Type":"ContainerStarted","Data":"04122d7f4c267cae0cc655756cb6be2ab30fc8f6f1b5ddc0a3a8f95f1cd71a16"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.791290 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.802753 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lh78p" event={"ID":"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c","Type":"ContainerStarted","Data":"c0ff89c98dcdcc42fcf84e9bca3783b909810286ae71fc37d8f7faa1f388120f"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.804948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m79dc" event={"ID":"05ff498d-af75-4603-8dbd-91a429e00cb8","Type":"ContainerStarted","Data":"a70c16f0273ed5ae5e2287408691cd521dbc34bde2dbcbc1d5ee5cced1a5eca1"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.844365 4895 generic.go:334] "Generic (PLEG): container finished" podID="d8931b7d-39d5-4912-9c99-7c4005368c7c" containerID="d75e130865849dd81914467b6ca73f60506f276593d8f8231a385ed6a587d480" exitCode=0 Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.848693 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" event={"ID":"d8931b7d-39d5-4912-9c99-7c4005368c7c","Type":"ContainerDied","Data":"d75e130865849dd81914467b6ca73f60506f276593d8f8231a385ed6a587d480"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.856249 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerStarted","Data":"650de0749f1fe5e365d73598b4b791cbaf63b48cd2b60f4a28c6445cc770c866"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.871305 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9nkgt" podStartSLOduration=2.871275155 podStartE2EDuration="2.871275155s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:35.804114199 +0000 UTC m=+1195.313833165" watchObservedRunningTime="2026-03-20 13:41:35.871275155 +0000 UTC m=+1195.380994121" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.893412 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerStarted","Data":"2c0d221171549fc3a05e8968e98773ed9d9d9053bcafb4ff210b3a1b3050b02f"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.899339 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.912784 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ccpr" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.914459 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.914488 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dlpcl" event={"ID":"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee","Type":"ContainerStarted","Data":"04fc448980f3e27eb78f718bee5920dc121fb9da3801182bbe986bded028d85e"} Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.944656 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dlpcl" podStartSLOduration=3.944641854 podStartE2EDuration="3.944641854s" podCreationTimestamp="2026-03-20 13:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:35.940343295 +0000 UTC m=+1195.450062261" watchObservedRunningTime="2026-03-20 13:41:35.944641854 +0000 UTC m=+1195.454360820" Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.962133 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:35 crc kubenswrapper[4895]: I0320 13:41:35.987078 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ccpr"] Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.572616 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669601 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669682 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669717 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669741 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgbcd\" (UniqueName: \"kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.669831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb\") pod \"d8931b7d-39d5-4912-9c99-7c4005368c7c\" (UID: \"d8931b7d-39d5-4912-9c99-7c4005368c7c\") " Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.704719 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.704949 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd" (OuterVolumeSpecName: "kube-api-access-rgbcd") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "kube-api-access-rgbcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.709674 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.726918 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config" (OuterVolumeSpecName: "config") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.754519 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.757806 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8931b7d-39d5-4912-9c99-7c4005368c7c" (UID: "d8931b7d-39d5-4912-9c99-7c4005368c7c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774398 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774429 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774442 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774452 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774460 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgbcd\" (UniqueName: \"kubernetes.io/projected/d8931b7d-39d5-4912-9c99-7c4005368c7c-kube-api-access-rgbcd\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.774469 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8931b7d-39d5-4912-9c99-7c4005368c7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.934677 4895 generic.go:334] "Generic (PLEG): container finished" podID="4a1141e6-4801-4733-891c-3e9607c36aca" containerID="dce7d4ad54b99c0122cf432982a21667f0ad82c14bdbff8ac649104fd4998b95" exitCode=0 Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.935051 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" event={"ID":"4a1141e6-4801-4733-891c-3e9607c36aca","Type":"ContainerDied","Data":"dce7d4ad54b99c0122cf432982a21667f0ad82c14bdbff8ac649104fd4998b95"} Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.946469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerStarted","Data":"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09"} Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.955504 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.955626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-9crs6" event={"ID":"d8931b7d-39d5-4912-9c99-7c4005368c7c","Type":"ContainerDied","Data":"d5404d77204ba6f1e9a3d5a5bfcb7b081ca5cf4f32aea4618633ca97f24d7bb2"} Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.955732 4895 scope.go:117] "RemoveContainer" containerID="d75e130865849dd81914467b6ca73f60506f276593d8f8231a385ed6a587d480" Mar 20 13:41:36 crc kubenswrapper[4895]: I0320 13:41:36.968227 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerStarted","Data":"10ba0e63cade1941d6e51c0c40c52d8889bce7905ccb960c7be7382455b1fe58"} Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.065437 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.096301 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-9crs6"] Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.283746 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" path="/var/lib/kubelet/pods/5738fec7-4f5b-40e9-82f6-d02d48d7a955/volumes" Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.284552 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8931b7d-39d5-4912-9c99-7c4005368c7c" path="/var/lib/kubelet/pods/d8931b7d-39d5-4912-9c99-7c4005368c7c/volumes" Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.980924 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerStarted","Data":"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd"} Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.980997 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-log" containerID="cri-o://0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" gracePeriod=30 Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.984158 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-httpd" containerID="cri-o://b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" gracePeriod=30 Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.989251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" event={"ID":"4a1141e6-4801-4733-891c-3e9607c36aca","Type":"ContainerStarted","Data":"e80caf46d08c27306713cb65c020842ec448f7d45ce4f0c5181e633195a1e1e7"} Mar 20 13:41:37 crc kubenswrapper[4895]: I0320 13:41:37.989636 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.010997 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.01097473 podStartE2EDuration="6.01097473s" podCreationTimestamp="2026-03-20 13:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:37.99958856 +0000 UTC m=+1197.509307526" watchObservedRunningTime="2026-03-20 13:41:38.01097473 +0000 UTC m=+1197.520693696" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.034738 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" podStartSLOduration=5.034720323 podStartE2EDuration="5.034720323s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:38.025078253 +0000 UTC m=+1197.534797219" watchObservedRunningTime="2026-03-20 13:41:38.034720323 +0000 UTC m=+1197.544439289" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.855055 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.948408 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.948619 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.948651 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949272 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs" (OuterVolumeSpecName: "logs") pod 
"92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949348 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949426 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xvqh\" (UniqueName: \"kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.949908 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run\") pod \"92dd66bd-73a6-4326-86fc-2a28d53171b9\" (UID: \"92dd66bd-73a6-4326-86fc-2a28d53171b9\") " Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.950333 4895 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.950716 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.954533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh" (OuterVolumeSpecName: "kube-api-access-7xvqh") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "kube-api-access-7xvqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.977462 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts" (OuterVolumeSpecName: "scripts") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.981869 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2" (OuterVolumeSpecName: "glance") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:41:38 crc kubenswrapper[4895]: I0320 13:41:38.987150 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.011850 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016047 4895 generic.go:334] "Generic (PLEG): container finished" podID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerID="b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" exitCode=143 Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016077 4895 generic.go:334] "Generic (PLEG): container finished" podID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerID="0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" exitCode=143 Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerDied","Data":"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd"} Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016165 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerDied","Data":"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09"} Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016176 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"92dd66bd-73a6-4326-86fc-2a28d53171b9","Type":"ContainerDied","Data":"b5978181f99adf91dce4b66eca0467fecd30179c379929ee6f35e977684ed44c"} Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016216 4895 scope.go:117] "RemoveContainer" containerID="b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.016370 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.022648 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data" (OuterVolumeSpecName: "config-data") pod "92dd66bd-73a6-4326-86fc-2a28d53171b9" (UID: "92dd66bd-73a6-4326-86fc-2a28d53171b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.026982 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerStarted","Data":"9f7f83eb76679943c56c5558fdcbfa456da1a32ecca1ec54a3839f8ef1a47fde"} Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.027078 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-log" containerID="cri-o://10ba0e63cade1941d6e51c0c40c52d8889bce7905ccb960c7be7382455b1fe58" gracePeriod=30 Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.027132 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-httpd" containerID="cri-o://9f7f83eb76679943c56c5558fdcbfa456da1a32ecca1ec54a3839f8ef1a47fde" gracePeriod=30 Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.054504 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.054486255 podStartE2EDuration="7.054486255s" podCreationTimestamp="2026-03-20 13:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:39.049111342 +0000 UTC m=+1198.558830308" watchObservedRunningTime="2026-03-20 13:41:39.054486255 +0000 UTC m=+1198.564205211" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062286 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062317 4895 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92dd66bd-73a6-4326-86fc-2a28d53171b9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062360 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062408 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") on node \"crc\" " Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062421 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062430 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92dd66bd-73a6-4326-86fc-2a28d53171b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.062442 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xvqh\" (UniqueName: \"kubernetes.io/projected/92dd66bd-73a6-4326-86fc-2a28d53171b9-kube-api-access-7xvqh\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.097967 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.098099 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2") on node "crc" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.162571 4895 scope.go:117] "RemoveContainer" containerID="0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.167928 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.302998 4895 scope.go:117] "RemoveContainer" containerID="b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.304639 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd\": container with ID starting with b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd not found: ID does not exist" containerID="b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.304672 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd"} err="failed to get container status \"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd\": rpc error: code = NotFound desc = could not find container \"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd\": container with ID starting with b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd not 
found: ID does not exist" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.304697 4895 scope.go:117] "RemoveContainer" containerID="0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.305330 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09\": container with ID starting with 0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09 not found: ID does not exist" containerID="0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.305373 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09"} err="failed to get container status \"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09\": rpc error: code = NotFound desc = could not find container \"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09\": container with ID starting with 0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09 not found: ID does not exist" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.305424 4895 scope.go:117] "RemoveContainer" containerID="b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.305842 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd"} err="failed to get container status \"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd\": rpc error: code = NotFound desc = could not find container \"b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd\": container with ID starting with 
b0e56a2a084f51d3a6b35d1b42cc39d55cb8273a173e2c20aa3d593880ca0fdd not found: ID does not exist" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.305861 4895 scope.go:117] "RemoveContainer" containerID="0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.306767 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09"} err="failed to get container status \"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09\": rpc error: code = NotFound desc = could not find container \"0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09\": container with ID starting with 0bfc90313d357bf72a6bbf3ea00250e562c740f9d5d9c71cd38a2416d00a2b09 not found: ID does not exist" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.360450 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.371829 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.384527 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.384958 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8931b7d-39d5-4912-9c99-7c4005368c7c" containerName="init" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.384979 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8931b7d-39d5-4912-9c99-7c4005368c7c" containerName="init" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.384990 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-log" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.384998 
4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-log" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.385012 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="dnsmasq-dns" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385019 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="dnsmasq-dns" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.385043 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="init" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385050 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="init" Mar 20 13:41:39 crc kubenswrapper[4895]: E0320 13:41:39.385078 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-httpd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385088 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-httpd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385474 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-log" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385493 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5738fec7-4f5b-40e9-82f6-d02d48d7a955" containerName="dnsmasq-dns" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385516 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8931b7d-39d5-4912-9c99-7c4005368c7c" containerName="init" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.385583 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" containerName="glance-httpd" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.388105 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.390627 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.390889 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.400007 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476811 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476854 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476906 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476943 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476962 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.476995 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.477030 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w7bg\" (UniqueName: \"kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg\") pod 
\"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.578988 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579049 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579810 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579836 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w7bg\" (UniqueName: \"kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579759 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.579971 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.580122 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.583118 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.583159 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b6443358e93f73fc16f0714d0b2e759c539b0d4c74a7e83912ba5d1f8dded95/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.585845 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.585891 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.588966 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 
crc kubenswrapper[4895]: I0320 13:41:39.589520 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.599978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w7bg\" (UniqueName: \"kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.643631 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " pod="openstack/glance-default-external-api-0" Mar 20 13:41:39 crc kubenswrapper[4895]: I0320 13:41:39.734498 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.039981 4895 generic.go:334] "Generic (PLEG): container finished" podID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerID="9f7f83eb76679943c56c5558fdcbfa456da1a32ecca1ec54a3839f8ef1a47fde" exitCode=0 Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.040237 4895 generic.go:334] "Generic (PLEG): container finished" podID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerID="10ba0e63cade1941d6e51c0c40c52d8889bce7905ccb960c7be7382455b1fe58" exitCode=143 Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.040270 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerDied","Data":"9f7f83eb76679943c56c5558fdcbfa456da1a32ecca1ec54a3839f8ef1a47fde"} Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.040292 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerDied","Data":"10ba0e63cade1941d6e51c0c40c52d8889bce7905ccb960c7be7382455b1fe58"} Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.042514 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" containerID="04fc448980f3e27eb78f718bee5920dc121fb9da3801182bbe986bded028d85e" exitCode=0 Mar 20 13:41:40 crc kubenswrapper[4895]: I0320 13:41:40.042558 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dlpcl" event={"ID":"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee","Type":"ContainerDied","Data":"04fc448980f3e27eb78f718bee5920dc121fb9da3801182bbe986bded028d85e"} Mar 20 13:41:41 crc kubenswrapper[4895]: I0320 13:41:41.228957 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dd66bd-73a6-4326-86fc-2a28d53171b9" 
path="/var/lib/kubelet/pods/92dd66bd-73a6-4326-86fc-2a28d53171b9/volumes" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.070064 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7405c05-f318-4636-b1e9-be1b4c208a51","Type":"ContainerDied","Data":"650de0749f1fe5e365d73598b4b791cbaf63b48cd2b60f4a28c6445cc770c866"} Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.070324 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="650de0749f1fe5e365d73598b4b791cbaf63b48cd2b60f4a28c6445cc770c866" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.073704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dlpcl" event={"ID":"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee","Type":"ContainerDied","Data":"63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7"} Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.073741 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a8a36d3886732a2b0495f610228e4c926c9f979265b36121e3378a3e244bc7" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.083220 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.089349 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.238941 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.238993 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239061 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239090 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239158 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239199 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gcwd\" (UniqueName: \"kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239217 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239261 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239280 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rfhs\" (UniqueName: \"kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs\") pod \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\" (UID: \"ef48bf1d-6c18-420b-a56f-c59e06e5a2ee\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239318 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239422 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.239547 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"e7405c05-f318-4636-b1e9-be1b4c208a51\" (UID: \"e7405c05-f318-4636-b1e9-be1b4c208a51\") " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.240402 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.240532 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs" (OuterVolumeSpecName: "logs") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.245640 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd" (OuterVolumeSpecName: "kube-api-access-7gcwd") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "kube-api-access-7gcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.245961 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts" (OuterVolumeSpecName: "scripts") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.251695 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs" (OuterVolumeSpecName: "kube-api-access-6rfhs") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "kube-api-access-6rfhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.258814 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.262739 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts" (OuterVolumeSpecName: "scripts") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.263726 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57" (OuterVolumeSpecName: "glance") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.271460 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.278467 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.281830 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data" (OuterVolumeSpecName: "config-data") pod "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" (UID: "ef48bf1d-6c18-420b-a56f-c59e06e5a2ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.282715 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.307616 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.314588 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data" (OuterVolumeSpecName: "config-data") pod "e7405c05-f318-4636-b1e9-be1b4c208a51" (UID: "e7405c05-f318-4636-b1e9-be1b4c208a51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.341905 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.341947 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gcwd\" (UniqueName: \"kubernetes.io/projected/e7405c05-f318-4636-b1e9-be1b4c208a51-kube-api-access-7gcwd\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.341963 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.341976 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.341989 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rfhs\" (UniqueName: \"kubernetes.io/projected/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-kube-api-access-6rfhs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342000 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342010 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342049 4895 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") on node \"crc\" " Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342064 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342075 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342087 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342097 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342108 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7405c05-f318-4636-b1e9-be1b4c208a51-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.342123 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7405c05-f318-4636-b1e9-be1b4c208a51-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.367776 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.367943 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57") on node "crc" Mar 20 13:41:42 crc kubenswrapper[4895]: I0320 13:41:42.443887 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.084502 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.084528 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dlpcl" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.127688 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.155425 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.181639 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:43 crc kubenswrapper[4895]: E0320 13:41:43.181975 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-log" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.181986 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-log" Mar 20 13:41:43 crc kubenswrapper[4895]: E0320 13:41:43.182001 4895 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-httpd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.182007 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-httpd" Mar 20 13:41:43 crc kubenswrapper[4895]: E0320 13:41:43.182016 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" containerName="keystone-bootstrap" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.182022 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" containerName="keystone-bootstrap" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.182192 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" containerName="keystone-bootstrap" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.182204 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-log" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.182220 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" containerName="glance-httpd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.183093 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.183162 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.187193 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.189512 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.235383 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7405c05-f318-4636-b1e9-be1b4c208a51" path="/var/lib/kubelet/pods/e7405c05-f318-4636-b1e9-be1b4c208a51/volumes" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263448 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263547 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263603 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.263682 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szfk\" (UniqueName: \"kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc 
kubenswrapper[4895]: I0320 13:41:43.290675 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dlpcl"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.298372 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dlpcl"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.366359 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.366669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.366809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.366884 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szfk\" (UniqueName: \"kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367224 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367291 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367302 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367398 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.367855 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.376261 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.377329 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hnpjd"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.377895 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.379383 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.380366 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.380803 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.385496 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.385808 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.385989 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.386208 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cfzps" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.399172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szfk\" (UniqueName: \"kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.427574 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hnpjd"] Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471550 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts\") pod \"keystone-bootstrap-hnpjd\" (UID: 
\"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471628 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471651 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhz2\" (UniqueName: \"kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471696 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.471716 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: 
\"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.490029 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.490072 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/18ebb825098e65852293f2b0f63099f5113b6726c6c2675c80c59a63de5999b9/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.546541 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: 
\"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574166 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574199 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhz2\" (UniqueName: \"kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574218 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.574239 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.578524 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 
13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.580095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.581524 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.581798 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.581997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.590877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhz2\" (UniqueName: \"kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2\") pod \"keystone-bootstrap-hnpjd\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.828856 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:41:43 crc kubenswrapper[4895]: I0320 13:41:43.844641 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:41:44 crc kubenswrapper[4895]: I0320 13:41:44.285212 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:41:44 crc kubenswrapper[4895]: I0320 13:41:44.341284 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"] Mar 20 13:41:44 crc kubenswrapper[4895]: I0320 13:41:44.341533 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" containerID="cri-o://b9bc36f0c78e3717e6cbb07a5b2db29a25fb3162fe8a2b63a795f620f2058f5d" gracePeriod=10 Mar 20 13:41:45 crc kubenswrapper[4895]: I0320 13:41:45.107943 4895 generic.go:334] "Generic (PLEG): container finished" podID="7484f3c6-ea94-407e-a221-0386705a5caa" containerID="b9bc36f0c78e3717e6cbb07a5b2db29a25fb3162fe8a2b63a795f620f2058f5d" exitCode=0 Mar 20 13:41:45 crc kubenswrapper[4895]: I0320 13:41:45.108002 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" event={"ID":"7484f3c6-ea94-407e-a221-0386705a5caa","Type":"ContainerDied","Data":"b9bc36f0c78e3717e6cbb07a5b2db29a25fb3162fe8a2b63a795f620f2058f5d"} Mar 20 13:41:45 crc kubenswrapper[4895]: I0320 13:41:45.222491 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef48bf1d-6c18-420b-a56f-c59e06e5a2ee" path="/var/lib/kubelet/pods/ef48bf1d-6c18-420b-a56f-c59e06e5a2ee/volumes" Mar 20 13:41:45 crc kubenswrapper[4895]: I0320 13:41:45.961358 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 20 13:41:49 crc kubenswrapper[4895]: E0320 13:41:49.328316 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 13:41:49 crc kubenswrapper[4895]: E0320 13:41:49.328924 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvnb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-clr45_openstack(c56a9ed0-b52b-42a4-a875-5d383303c91e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:41:49 crc kubenswrapper[4895]: E0320 13:41:49.330260 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-clr45" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" Mar 20 13:41:50 crc kubenswrapper[4895]: E0320 13:41:50.151790 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-clr45" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" Mar 20 13:41:50 crc kubenswrapper[4895]: I0320 13:41:50.960772 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 20 13:41:52 crc kubenswrapper[4895]: I0320 13:41:52.297455 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:52 crc kubenswrapper[4895]: I0320 13:41:52.297524 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" 
podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:52 crc kubenswrapper[4895]: I0320 13:41:52.297566 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:41:52 crc kubenswrapper[4895]: I0320 13:41:52.298357 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:41:52 crc kubenswrapper[4895]: I0320 13:41:52.298461 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5" gracePeriod=600 Mar 20 13:41:53 crc kubenswrapper[4895]: I0320 13:41:53.178620 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5" exitCode=0 Mar 20 13:41:53 crc kubenswrapper[4895]: I0320 13:41:53.178813 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5"} Mar 20 13:41:53 crc kubenswrapper[4895]: I0320 13:41:53.178929 4895 scope.go:117] "RemoveContainer" 
containerID="7665a62459ae1c7b18f9301e4a45266b3aa3e993a41f7a98be3e1daf3d48e4a6" Mar 20 13:41:54 crc kubenswrapper[4895]: I0320 13:41:54.191297 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c43c76c-2573-4cce-880d-830e4fd8bed9" containerID="fa35306d58e049177176882caa1719c8f32a63e8debd7da7dec0e87b3b63d162" exitCode=0 Mar 20 13:41:54 crc kubenswrapper[4895]: I0320 13:41:54.191431 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9nkgt" event={"ID":"2c43c76c-2573-4cce-880d-830e4fd8bed9","Type":"ContainerDied","Data":"fa35306d58e049177176882caa1719c8f32a63e8debd7da7dec0e87b3b63d162"} Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.164364 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-98njk"] Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.166975 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.170450 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.170735 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.174752 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.206448 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-98njk"] Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.336756 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g2x\" (UniqueName: \"kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x\") 
pod \"auto-csr-approver-29566902-98njk\" (UID: \"6a9eaaed-76a2-47d6-9329-bc5b9bf34807\") " pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.438250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g2x\" (UniqueName: \"kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x\") pod \"auto-csr-approver-29566902-98njk\" (UID: \"6a9eaaed-76a2-47d6-9329-bc5b9bf34807\") " pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.458339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g2x\" (UniqueName: \"kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x\") pod \"auto-csr-approver-29566902-98njk\" (UID: \"6a9eaaed-76a2-47d6-9329-bc5b9bf34807\") " pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.497508 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.962090 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: i/o timeout" Mar 20 13:42:00 crc kubenswrapper[4895]: I0320 13:42:00.962488 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" Mar 20 13:42:03 crc kubenswrapper[4895]: E0320 13:42:03.767969 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 13:42:03 crc kubenswrapper[4895]: E0320 13:42:03.768535 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dt7gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-57bfc_openstack(19fedca4-15c2-4975-807e-e0c9ded7f329): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:03 crc kubenswrapper[4895]: E0320 13:42:03.772077 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-57bfc" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" Mar 20 13:42:03 crc kubenswrapper[4895]: I0320 13:42:03.832895 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" Mar 20 13:42:03 crc kubenswrapper[4895]: I0320 13:42:03.850188 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008017 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle\") pod \"2c43c76c-2573-4cce-880d-830e4fd8bed9\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008190 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb\") pod \"7484f3c6-ea94-407e-a221-0386705a5caa\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008248 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config\") pod \"7484f3c6-ea94-407e-a221-0386705a5caa\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config\") pod \"2c43c76c-2573-4cce-880d-830e4fd8bed9\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008311 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb\") pod \"7484f3c6-ea94-407e-a221-0386705a5caa\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc\") pod \"7484f3c6-ea94-407e-a221-0386705a5caa\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8kk\" (UniqueName: \"kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk\") pod \"2c43c76c-2573-4cce-880d-830e4fd8bed9\" (UID: \"2c43c76c-2573-4cce-880d-830e4fd8bed9\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.008428 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx826\" (UniqueName: \"kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826\") pod \"7484f3c6-ea94-407e-a221-0386705a5caa\" (UID: \"7484f3c6-ea94-407e-a221-0386705a5caa\") " Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.012723 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826" (OuterVolumeSpecName: "kube-api-access-vx826") pod "7484f3c6-ea94-407e-a221-0386705a5caa" (UID: "7484f3c6-ea94-407e-a221-0386705a5caa"). InnerVolumeSpecName "kube-api-access-vx826". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.012775 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk" (OuterVolumeSpecName: "kube-api-access-8x8kk") pod "2c43c76c-2573-4cce-880d-830e4fd8bed9" (UID: "2c43c76c-2573-4cce-880d-830e4fd8bed9"). InnerVolumeSpecName "kube-api-access-8x8kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.035161 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config" (OuterVolumeSpecName: "config") pod "2c43c76c-2573-4cce-880d-830e4fd8bed9" (UID: "2c43c76c-2573-4cce-880d-830e4fd8bed9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.038558 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c43c76c-2573-4cce-880d-830e4fd8bed9" (UID: "2c43c76c-2573-4cce-880d-830e4fd8bed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.057053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7484f3c6-ea94-407e-a221-0386705a5caa" (UID: "7484f3c6-ea94-407e-a221-0386705a5caa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.058592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7484f3c6-ea94-407e-a221-0386705a5caa" (UID: "7484f3c6-ea94-407e-a221-0386705a5caa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.061832 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7484f3c6-ea94-407e-a221-0386705a5caa" (UID: "7484f3c6-ea94-407e-a221-0386705a5caa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.064540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config" (OuterVolumeSpecName: "config") pod "7484f3c6-ea94-407e-a221-0386705a5caa" (UID: "7484f3c6-ea94-407e-a221-0386705a5caa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110471 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110509 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110522 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110533 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c43c76c-2573-4cce-880d-830e4fd8bed9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: 
I0320 13:42:04.110544 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110555 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7484f3c6-ea94-407e-a221-0386705a5caa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110566 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8kk\" (UniqueName: \"kubernetes.io/projected/2c43c76c-2573-4cce-880d-830e4fd8bed9-kube-api-access-8x8kk\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.110580 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx826\" (UniqueName: \"kubernetes.io/projected/7484f3c6-ea94-407e-a221-0386705a5caa-kube-api-access-vx826\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.262121 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.310800 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" event={"ID":"7484f3c6-ea94-407e-a221-0386705a5caa","Type":"ContainerDied","Data":"e9e896cc6577a4fa89f9a8b90e94f661023412d333d5543c9d452db662e7d3ab"} Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.310819 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.313815 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9nkgt" Mar 20 13:42:04 crc kubenswrapper[4895]: E0320 13:42:04.321304 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-57bfc" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.321453 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9nkgt" event={"ID":"2c43c76c-2573-4cce-880d-830e4fd8bed9","Type":"ContainerDied","Data":"04122d7f4c267cae0cc655756cb6be2ab30fc8f6f1b5ddc0a3a8f95f1cd71a16"} Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.321481 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04122d7f4c267cae0cc655756cb6be2ab30fc8f6f1b5ddc0a3a8f95f1cd71a16" Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.365493 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"] Mar 20 13:42:04 crc kubenswrapper[4895]: I0320 13:42:04.376740 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9kqz5"] Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.195690 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"] Mar 20 13:42:05 crc kubenswrapper[4895]: E0320 13:42:05.196354 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c43c76c-2573-4cce-880d-830e4fd8bed9" containerName="neutron-db-sync" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.196369 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c43c76c-2573-4cce-880d-830e4fd8bed9" containerName="neutron-db-sync" Mar 20 13:42:05 crc kubenswrapper[4895]: E0320 13:42:05.196385 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="init" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.196412 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="init" Mar 20 13:42:05 crc kubenswrapper[4895]: E0320 13:42:05.196421 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.196429 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.196728 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.196747 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c43c76c-2573-4cce-880d-830e4fd8bed9" containerName="neutron-db-sync" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.198849 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254349 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254494 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254584 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254608 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254634 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mts\" (UniqueName: \"kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts\") pod 
\"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.254783 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" path="/var/lib/kubelet/pods/7484f3c6-ea94-407e-a221-0386705a5caa/volumes" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.255702 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"] Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.255734 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.258180 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.258283 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.262713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.269451 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.269639 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ngd5w" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.269773 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.356902 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.356962 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357009 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357032 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddq5\" (UniqueName: \"kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357066 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357135 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357159 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357182 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357205 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357240 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mts\" (UniqueName: \"kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.357268 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.358009 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.358360 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.358779 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.359350 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.359960 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.380125 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mts\" (UniqueName: \"kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts\") pod \"dnsmasq-dns-6b7b667979-bxj4g\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") " pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.459176 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.459227 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.459274 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.459320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.459361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fddq5\" (UniqueName: \"kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.464668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.465641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle\") pod \"neutron-6547c6468-fs8ld\" (UID: 
\"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.466521 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.467573 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.476443 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddq5\" (UniqueName: \"kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5\") pod \"neutron-6547c6468-fs8ld\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.544596 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.580994 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:05 crc kubenswrapper[4895]: I0320 13:42:05.963188 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-9kqz5" podUID="7484f3c6-ea94-407e-a221-0386705a5caa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: i/o timeout" Mar 20 13:42:06 crc kubenswrapper[4895]: W0320 13:42:06.966929 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a17c32e_3090_49b2_ac2a_91572b5eab39.slice/crio-d7d4b08fcb85851a249860cee28e8e9881e276a6f79ebc5b9a7500dce79e1829 WatchSource:0}: Error finding container d7d4b08fcb85851a249860cee28e8e9881e276a6f79ebc5b9a7500dce79e1829: Status 404 returned error can't find the container with id d7d4b08fcb85851a249860cee28e8e9881e276a6f79ebc5b9a7500dce79e1829 Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.344599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerStarted","Data":"d7d4b08fcb85851a249860cee28e8e9881e276a6f79ebc5b9a7500dce79e1829"} Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.440001 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.441943 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.446763 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.447003 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.514804 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.596859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.596926 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.596948 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.596991 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.597014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.597031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.597071 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxlx\" (UniqueName: \"kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.602693 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hnpjd"] Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.612836 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699191 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699268 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699306 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699363 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699421 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699448 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" 
(UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.699505 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxlx\" (UniqueName: \"kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.707099 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.709978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.710660 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.715539 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc 
kubenswrapper[4895]: I0320 13:42:07.721705 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.722173 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxlx\" (UniqueName: \"kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.725739 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs\") pod \"neutron-6b745c9b4c-q9rhb\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:07 crc kubenswrapper[4895]: I0320 13:42:07.776302 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:10 crc kubenswrapper[4895]: I0320 13:42:10.628806 4895 scope.go:117] "RemoveContainer" containerID="b9bc36f0c78e3717e6cbb07a5b2db29a25fb3162fe8a2b63a795f620f2058f5d" Mar 20 13:42:10 crc kubenswrapper[4895]: W0320 13:42:10.647540 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b9f3d1_2bfd_43a4_ba9e_2f5008d865d7.slice/crio-d895dc483020f93f1aa4747ce070728e1f6e4e4b101977a8420380cf7d26f51f WatchSource:0}: Error finding container d895dc483020f93f1aa4747ce070728e1f6e4e4b101977a8420380cf7d26f51f: Status 404 returned error can't find the container with id d895dc483020f93f1aa4747ce070728e1f6e4e4b101977a8420380cf7d26f51f Mar 20 13:42:10 crc kubenswrapper[4895]: E0320 13:42:10.656837 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 20 13:42:10 crc kubenswrapper[4895]: E0320 13:42:10.656889 4895 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 20 13:42:10 crc kubenswrapper[4895]: E0320 13:42:10.657076 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tz2bl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-lh78p_openstack(8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:10 crc kubenswrapper[4895]: E0320 13:42:10.658288 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-lh78p" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" Mar 20 13:42:10 crc kubenswrapper[4895]: I0320 13:42:10.893963 4895 scope.go:117] "RemoveContainer" containerID="144c81ee2b61e5cecf8e2e43c64e4a23f44336604f9cbe1761503cd39fca1fdd" Mar 20 13:42:10 crc kubenswrapper[4895]: E0320 13:42:10.905914 4895 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/dnsmasq-dns-b8fbc5445-9kqz5_openstack_init-144c81ee2b61e5cecf8e2e43c64e4a23f44336604f9cbe1761503cd39fca1fdd.log: no such file or directory" path="/var/log/containers/dnsmasq-dns-b8fbc5445-9kqz5_openstack_init-144c81ee2b61e5cecf8e2e43c64e4a23f44336604f9cbe1761503cd39fca1fdd.log" Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.185201 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-98njk"] Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.429278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerStarted","Data":"d895dc483020f93f1aa4747ce070728e1f6e4e4b101977a8420380cf7d26f51f"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.445250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-clr45" event={"ID":"c56a9ed0-b52b-42a4-a875-5d383303c91e","Type":"ContainerStarted","Data":"928b408aa9ce905eb0fdab7c5bd8f5c9ca76daf917c55d5268da6de23f94d10e"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.448250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerStarted","Data":"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.450083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-98njk" event={"ID":"6a9eaaed-76a2-47d6-9329-bc5b9bf34807","Type":"ContainerStarted","Data":"fe0dd3a0474e62b4e2ecea75b6d9d48a63203dee77c20c5364d10ca886dfd181"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.478360 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.486812 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-clr45" podStartSLOduration=2.612671909 podStartE2EDuration="38.486792924s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="2026-03-20 13:41:34.857295726 +0000 UTC m=+1194.367014682" lastFinishedPulling="2026-03-20 13:42:10.731416731 +0000 UTC m=+1230.241135697" observedRunningTime="2026-03-20 13:42:11.460666856 +0000 UTC m=+1230.970385822" watchObservedRunningTime="2026-03-20 13:42:11.486792924 +0000 UTC m=+1230.996511890" Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.496761 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.504993 
4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnpjd" event={"ID":"209c54a2-1964-481c-80e4-16eaef611f4e","Type":"ContainerStarted","Data":"b5d8f9cfc2e261233c9bfc1cef442b5b2237ecd1593504dbe4c3a97cb2eae86e"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.505214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnpjd" event={"ID":"209c54a2-1964-481c-80e4-16eaef611f4e","Type":"ContainerStarted","Data":"8bb638970c1be6142669764dcbd16631c60bc1504b565f35a44367059449b6ca"} Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.507975 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m79dc" event={"ID":"05ff498d-af75-4603-8dbd-91a429e00cb8","Type":"ContainerStarted","Data":"7700f88e906f3ddddc252e5432352b1ecba48f2f2d6ad50e676b382bfd2848d8"} Mar 20 13:42:11 crc kubenswrapper[4895]: E0320 13:42:11.510532 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-lh78p" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.538166 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"] Mar 20 13:42:11 crc kubenswrapper[4895]: W0320 13:42:11.538957 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda31897ee_2067_4a4d_9ecd_c9ed35777f92.slice/crio-e174a1beace5c4093b2b410210b3ee303b9f6a45e8cab8edc1c3d33aeb75cf4c WatchSource:0}: Error finding container e174a1beace5c4093b2b410210b3ee303b9f6a45e8cab8edc1c3d33aeb75cf4c: Status 404 returned error can't find the container with id e174a1beace5c4093b2b410210b3ee303b9f6a45e8cab8edc1c3d33aeb75cf4c Mar 20 13:42:11 crc 
kubenswrapper[4895]: I0320 13:42:11.557065 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hnpjd" podStartSLOduration=28.557044311 podStartE2EDuration="28.557044311s" podCreationTimestamp="2026-03-20 13:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:11.536775027 +0000 UTC m=+1231.046494003" watchObservedRunningTime="2026-03-20 13:42:11.557044311 +0000 UTC m=+1231.066763277" Mar 20 13:42:11 crc kubenswrapper[4895]: I0320 13:42:11.568922 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-m79dc" podStartSLOduration=9.830085528 podStartE2EDuration="38.568900062s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="2026-03-20 13:41:34.971502959 +0000 UTC m=+1194.481221925" lastFinishedPulling="2026-03-20 13:42:03.710317483 +0000 UTC m=+1223.220036459" observedRunningTime="2026-03-20 13:42:11.553340866 +0000 UTC m=+1231.063059852" watchObservedRunningTime="2026-03-20 13:42:11.568900062 +0000 UTC m=+1231.078619028" Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.059511 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:42:12 crc kubenswrapper[4895]: W0320 13:42:12.086028 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402d2058_787a_48d5_afb4_7f54fbf42121.slice/crio-f5b65a01fd153654df673706441529a9b4bc54793b01cb1a00cb7d4a41dacf44 WatchSource:0}: Error finding container f5b65a01fd153654df673706441529a9b4bc54793b01cb1a00cb7d4a41dacf44: Status 404 returned error can't find the container with id f5b65a01fd153654df673706441529a9b4bc54793b01cb1a00cb7d4a41dacf44 Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.519535 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerStarted","Data":"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.522736 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerStarted","Data":"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.522780 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerStarted","Data":"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.522792 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerStarted","Data":"5db68a92f4379b74cab069a9a785d2f8b2c33afc2c6ebf023929b0708f811fc2"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.523101 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.532666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerStarted","Data":"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.534663 4895 generic.go:334] "Generic (PLEG): container finished" podID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerID="265358b81e0e820dbb2aa1dc39d10ea13e0b7e36035b4cec890973eaf7caa307" exitCode=0 Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.534758 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" event={"ID":"a31897ee-2067-4a4d-9ecd-c9ed35777f92","Type":"ContainerDied","Data":"265358b81e0e820dbb2aa1dc39d10ea13e0b7e36035b4cec890973eaf7caa307"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.534804 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" event={"ID":"a31897ee-2067-4a4d-9ecd-c9ed35777f92","Type":"ContainerStarted","Data":"e174a1beace5c4093b2b410210b3ee303b9f6a45e8cab8edc1c3d33aeb75cf4c"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.538720 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerStarted","Data":"5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.538793 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerStarted","Data":"f5b65a01fd153654df673706441529a9b4bc54793b01cb1a00cb7d4a41dacf44"} Mar 20 13:42:12 crc kubenswrapper[4895]: I0320 13:42:12.549182 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b745c9b4c-q9rhb" podStartSLOduration=5.5491645080000005 podStartE2EDuration="5.549164508s" podCreationTimestamp="2026-03-20 13:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:12.548848341 +0000 UTC m=+1232.058567297" watchObservedRunningTime="2026-03-20 13:42:12.549164508 +0000 UTC m=+1232.058883474" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.548195 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" 
event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerStarted","Data":"cd6786ff3e657bd26bbbf8b98f58c9cf972897524a74bdbaa3a483ccaa8af59e"} Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.548657 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.550260 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerStarted","Data":"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2"} Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.563886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-98njk" event={"ID":"6a9eaaed-76a2-47d6-9329-bc5b9bf34807","Type":"ContainerStarted","Data":"59cca7b17788ca8ea30634268b6974726d95253e7da3534f6c7823783cfd65d0"} Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.568826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerStarted","Data":"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed"} Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.600372 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6547c6468-fs8ld" podStartSLOduration=8.600348558 podStartE2EDuration="8.600348558s" podCreationTimestamp="2026-03-20 13:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:13.58077456 +0000 UTC m=+1233.090493546" watchObservedRunningTime="2026-03-20 13:42:13.600348558 +0000 UTC m=+1233.110067524" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.613228 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29566902-98njk" podStartSLOduration=12.517741589 podStartE2EDuration="13.613215032s" podCreationTimestamp="2026-03-20 13:42:00 +0000 UTC" firstStartedPulling="2026-03-20 13:42:11.202681363 +0000 UTC m=+1230.712400329" lastFinishedPulling="2026-03-20 13:42:12.298154806 +0000 UTC m=+1231.807873772" observedRunningTime="2026-03-20 13:42:13.611840451 +0000 UTC m=+1233.121559417" watchObservedRunningTime="2026-03-20 13:42:13.613215032 +0000 UTC m=+1233.122933998" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.631960 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.631939052 podStartE2EDuration="30.631939052s" podCreationTimestamp="2026-03-20 13:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:13.627556221 +0000 UTC m=+1233.137275197" watchObservedRunningTime="2026-03-20 13:42:13.631939052 +0000 UTC m=+1233.141658018" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.652071 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=34.652054431 podStartE2EDuration="34.652054431s" podCreationTimestamp="2026-03-20 13:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:13.646717889 +0000 UTC m=+1233.156436855" watchObservedRunningTime="2026-03-20 13:42:13.652054431 +0000 UTC m=+1233.161773397" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.845084 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.845198 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.845232 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.845246 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.898455 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:13 crc kubenswrapper[4895]: I0320 13:42:13.900154 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.579788 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" event={"ID":"a31897ee-2067-4a4d-9ecd-c9ed35777f92","Type":"ContainerStarted","Data":"96da8cbac28b924a7c6a47f234535444b5c297e1b10f8fa93a9e21c08f94d390"} Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.580111 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.582536 4895 generic.go:334] "Generic (PLEG): container finished" podID="05ff498d-af75-4603-8dbd-91a429e00cb8" containerID="7700f88e906f3ddddc252e5432352b1ecba48f2f2d6ad50e676b382bfd2848d8" exitCode=0 Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.582621 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m79dc" event={"ID":"05ff498d-af75-4603-8dbd-91a429e00cb8","Type":"ContainerDied","Data":"7700f88e906f3ddddc252e5432352b1ecba48f2f2d6ad50e676b382bfd2848d8"} Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.585023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerStarted","Data":"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e"} Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.593005 4895 generic.go:334] "Generic (PLEG): container finished" podID="6a9eaaed-76a2-47d6-9329-bc5b9bf34807" containerID="59cca7b17788ca8ea30634268b6974726d95253e7da3534f6c7823783cfd65d0" exitCode=0 Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.594072 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-98njk" event={"ID":"6a9eaaed-76a2-47d6-9329-bc5b9bf34807","Type":"ContainerDied","Data":"59cca7b17788ca8ea30634268b6974726d95253e7da3534f6c7823783cfd65d0"} Mar 20 13:42:14 crc kubenswrapper[4895]: I0320 13:42:14.621960 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" podStartSLOduration=9.621930242 podStartE2EDuration="9.621930242s" podCreationTimestamp="2026-03-20 13:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:14.604625345 +0000 UTC m=+1234.114344341" watchObservedRunningTime="2026-03-20 13:42:14.621930242 +0000 UTC m=+1234.131649208" Mar 20 13:42:15 crc kubenswrapper[4895]: I0320 13:42:15.606530 4895 generic.go:334] "Generic (PLEG): container finished" podID="209c54a2-1964-481c-80e4-16eaef611f4e" containerID="b5d8f9cfc2e261233c9bfc1cef442b5b2237ecd1593504dbe4c3a97cb2eae86e" exitCode=0 Mar 20 13:42:15 crc kubenswrapper[4895]: I0320 13:42:15.606798 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnpjd" event={"ID":"209c54a2-1964-481c-80e4-16eaef611f4e","Type":"ContainerDied","Data":"b5d8f9cfc2e261233c9bfc1cef442b5b2237ecd1593504dbe4c3a97cb2eae86e"} Mar 20 13:42:15 crc kubenswrapper[4895]: I0320 13:42:15.610455 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="c56a9ed0-b52b-42a4-a875-5d383303c91e" containerID="928b408aa9ce905eb0fdab7c5bd8f5c9ca76daf917c55d5268da6de23f94d10e" exitCode=0 Mar 20 13:42:15 crc kubenswrapper[4895]: I0320 13:42:15.611492 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-clr45" event={"ID":"c56a9ed0-b52b-42a4-a875-5d383303c91e","Type":"ContainerDied","Data":"928b408aa9ce905eb0fdab7c5bd8f5c9ca76daf917c55d5268da6de23f94d10e"} Mar 20 13:42:19 crc kubenswrapper[4895]: I0320 13:42:19.737696 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:42:19 crc kubenswrapper[4895]: I0320 13:42:19.738328 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:42:19 crc kubenswrapper[4895]: I0320 13:42:19.776611 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:42:19 crc kubenswrapper[4895]: I0320 13:42:19.816460 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:42:20 crc kubenswrapper[4895]: I0320 13:42:20.545602 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:20 crc kubenswrapper[4895]: I0320 13:42:20.616130 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:42:20 crc kubenswrapper[4895]: I0320 13:42:20.616410 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="dnsmasq-dns" containerID="cri-o://e80caf46d08c27306713cb65c020842ec448f7d45ce4f0c5181e633195a1e1e7" gracePeriod=10 Mar 20 13:42:20 crc kubenswrapper[4895]: I0320 13:42:20.673823 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:42:20 crc kubenswrapper[4895]: I0320 13:42:20.673863 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:42:21 crc kubenswrapper[4895]: I0320 13:42:21.682241 4895 generic.go:334] "Generic (PLEG): container finished" podID="4a1141e6-4801-4733-891c-3e9607c36aca" containerID="e80caf46d08c27306713cb65c020842ec448f7d45ce4f0c5181e633195a1e1e7" exitCode=0 Mar 20 13:42:21 crc kubenswrapper[4895]: I0320 13:42:21.682327 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" event={"ID":"4a1141e6-4801-4733-891c-3e9607c36aca","Type":"ContainerDied","Data":"e80caf46d08c27306713cb65c020842ec448f7d45ce4f0c5181e633195a1e1e7"} Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.693571 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-clr45" event={"ID":"c56a9ed0-b52b-42a4-a875-5d383303c91e","Type":"ContainerDied","Data":"30056c479494251b1287ebb6860e96581c501b758ed149f1960a208fcffa7cda"} Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.693609 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30056c479494251b1287ebb6860e96581c501b758ed149f1960a208fcffa7cda" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.697055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m79dc" event={"ID":"05ff498d-af75-4603-8dbd-91a429e00cb8","Type":"ContainerDied","Data":"a70c16f0273ed5ae5e2287408691cd521dbc34bde2dbcbc1d5ee5cced1a5eca1"} Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.697118 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70c16f0273ed5ae5e2287408691cd521dbc34bde2dbcbc1d5ee5cced1a5eca1" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.699437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566902-98njk" event={"ID":"6a9eaaed-76a2-47d6-9329-bc5b9bf34807","Type":"ContainerDied","Data":"fe0dd3a0474e62b4e2ecea75b6d9d48a63203dee77c20c5364d10ca886dfd181"} Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.699468 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0dd3a0474e62b4e2ecea75b6d9d48a63203dee77c20c5364d10ca886dfd181" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.705195 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnpjd" event={"ID":"209c54a2-1964-481c-80e4-16eaef611f4e","Type":"ContainerDied","Data":"8bb638970c1be6142669764dcbd16631c60bc1504b565f35a44367059449b6ca"} Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.705267 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb638970c1be6142669764dcbd16631c60bc1504b565f35a44367059449b6ca" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.705225 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.705325 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.799983 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.807483 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-98njk" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.816752 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m79dc" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.833364 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-clr45" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932738 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle\") pod \"05ff498d-af75-4603-8dbd-91a429e00cb8\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932779 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slhz2\" (UniqueName: \"kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932870 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvnb9\" (UniqueName: \"kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9\") pod \"c56a9ed0-b52b-42a4-a875-5d383303c91e\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932895 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932916 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts\") pod \"05ff498d-af75-4603-8dbd-91a429e00cb8\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.932948 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data\") pod \"c56a9ed0-b52b-42a4-a875-5d383303c91e\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933038 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle\") pod \"c56a9ed0-b52b-42a4-a875-5d383303c91e\" (UID: \"c56a9ed0-b52b-42a4-a875-5d383303c91e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933054 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data\") pod \"05ff498d-af75-4603-8dbd-91a429e00cb8\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs\") pod \"05ff498d-af75-4603-8dbd-91a429e00cb8\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933127 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933159 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5g2x\" (UniqueName: \"kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x\") pod \"6a9eaaed-76a2-47d6-9329-bc5b9bf34807\" (UID: \"6a9eaaed-76a2-47d6-9329-bc5b9bf34807\") " Mar 20 13:42:22 crc kubenswrapper[4895]: 
I0320 13:42:22.933190 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933261 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzzf7\" (UniqueName: \"kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7\") pod \"05ff498d-af75-4603-8dbd-91a429e00cb8\" (UID: \"05ff498d-af75-4603-8dbd-91a429e00cb8\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.933283 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle\") pod \"209c54a2-1964-481c-80e4-16eaef611f4e\" (UID: \"209c54a2-1964-481c-80e4-16eaef611f4e\") " Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.939494 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts" (OuterVolumeSpecName: "scripts") pod "05ff498d-af75-4603-8dbd-91a429e00cb8" (UID: "05ff498d-af75-4603-8dbd-91a429e00cb8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.940100 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2" (OuterVolumeSpecName: "kube-api-access-slhz2") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "kube-api-access-slhz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.940183 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c56a9ed0-b52b-42a4-a875-5d383303c91e" (UID: "c56a9ed0-b52b-42a4-a875-5d383303c91e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.940249 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.940531 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs" (OuterVolumeSpecName: "logs") pod "05ff498d-af75-4603-8dbd-91a429e00cb8" (UID: "05ff498d-af75-4603-8dbd-91a429e00cb8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.941861 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x" (OuterVolumeSpecName: "kube-api-access-m5g2x") pod "6a9eaaed-76a2-47d6-9329-bc5b9bf34807" (UID: "6a9eaaed-76a2-47d6-9329-bc5b9bf34807"). InnerVolumeSpecName "kube-api-access-m5g2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.943038 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7" (OuterVolumeSpecName: "kube-api-access-hzzf7") pod "05ff498d-af75-4603-8dbd-91a429e00cb8" (UID: "05ff498d-af75-4603-8dbd-91a429e00cb8"). InnerVolumeSpecName "kube-api-access-hzzf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.943363 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts" (OuterVolumeSpecName: "scripts") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.945611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9" (OuterVolumeSpecName: "kube-api-access-qvnb9") pod "c56a9ed0-b52b-42a4-a875-5d383303c91e" (UID: "c56a9ed0-b52b-42a4-a875-5d383303c91e"). InnerVolumeSpecName "kube-api-access-qvnb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.957229 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.965917 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c56a9ed0-b52b-42a4-a875-5d383303c91e" (UID: "c56a9ed0-b52b-42a4-a875-5d383303c91e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.965951 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data" (OuterVolumeSpecName: "config-data") pod "05ff498d-af75-4603-8dbd-91a429e00cb8" (UID: "05ff498d-af75-4603-8dbd-91a429e00cb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.966621 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05ff498d-af75-4603-8dbd-91a429e00cb8" (UID: "05ff498d-af75-4603-8dbd-91a429e00cb8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.970817 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data" (OuterVolumeSpecName: "config-data") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:22 crc kubenswrapper[4895]: I0320 13:42:22.978562 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "209c54a2-1964-481c-80e4-16eaef611f4e" (UID: "209c54a2-1964-481c-80e4-16eaef611f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.034892 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035027 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slhz2\" (UniqueName: \"kubernetes.io/projected/209c54a2-1964-481c-80e4-16eaef611f4e-kube-api-access-slhz2\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035096 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvnb9\" (UniqueName: \"kubernetes.io/projected/c56a9ed0-b52b-42a4-a875-5d383303c91e-kube-api-access-qvnb9\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035147 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-fernet-keys\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035197 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035256 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035309 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56a9ed0-b52b-42a4-a875-5d383303c91e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035366 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ff498d-af75-4603-8dbd-91a429e00cb8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035460 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05ff498d-af75-4603-8dbd-91a429e00cb8-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035523 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035579 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5g2x\" (UniqueName: \"kubernetes.io/projected/6a9eaaed-76a2-47d6-9329-bc5b9bf34807-kube-api-access-m5g2x\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035702 4895 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035758 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035872 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzzf7\" (UniqueName: \"kubernetes.io/projected/05ff498d-af75-4603-8dbd-91a429e00cb8-kube-api-access-hzzf7\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.035991 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209c54a2-1964-481c-80e4-16eaef611f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.731131 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hnpjd" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.731176 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m79dc" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.731210 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" event={"ID":"4a1141e6-4801-4733-891c-3e9607c36aca","Type":"ContainerDied","Data":"0ffadad289dead97ffdf0eb1b5d4918f748a4b17333cb0cf13d7c3822cdd5f94"} Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.733509 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ffadad289dead97ffdf0eb1b5d4918f748a4b17333cb0cf13d7c3822cdd5f94" Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.731490 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-98njk"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.731437 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-clr45"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.928599 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.955202 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-85vf7"]
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.977258 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-85vf7"]
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.988459 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65558fd5f5-5tmzj"]
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.988908 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="dnsmasq-dns"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.988926 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="dnsmasq-dns"
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.988944 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209c54a2-1964-481c-80e4-16eaef611f4e" containerName="keystone-bootstrap"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.988952 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="209c54a2-1964-481c-80e4-16eaef611f4e" containerName="keystone-bootstrap"
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.988964 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9eaaed-76a2-47d6-9329-bc5b9bf34807" containerName="oc"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.988970 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9eaaed-76a2-47d6-9329-bc5b9bf34807" containerName="oc"
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.988987 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ff498d-af75-4603-8dbd-91a429e00cb8" containerName="placement-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.988993 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ff498d-af75-4603-8dbd-91a429e00cb8" containerName="placement-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.989001 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="init"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989007 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="init"
Mar 20 13:42:23 crc kubenswrapper[4895]: E0320 13:42:23.989028 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" containerName="barbican-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989035 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" containerName="barbican-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989203 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="209c54a2-1964-481c-80e4-16eaef611f4e" containerName="keystone-bootstrap"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989223 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ff498d-af75-4603-8dbd-91a429e00cb8" containerName="placement-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989235 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" containerName="barbican-db-sync"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989250 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" containerName="dnsmasq-dns"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989256 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9eaaed-76a2-47d6-9329-bc5b9bf34807" containerName="oc"
Mar 20 13:42:23 crc kubenswrapper[4895]: I0320 13:42:23.989969 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.996903 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.997115 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.997158 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.997295 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cfzps"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.997419 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:23.997922 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.033287 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65558fd5f5-5tmzj"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056038 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056277 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngwsx\" (UniqueName: \"kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056333 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc\") pod \"4a1141e6-4801-4733-891c-3e9607c36aca\" (UID: \"4a1141e6-4801-4733-891c-3e9607c36aca\") "
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056792 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-scripts\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.056897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-credential-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-internal-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057061 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-fernet-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057113 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-config-data\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5k7v\" (UniqueName: \"kubernetes.io/projected/54aa85fa-25cb-409a-be60-4c0cb8468466-kube-api-access-d5k7v\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057279 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-combined-ca-bundle\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.057300 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-public-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.082351 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-85d989d55b-spf52"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.086948 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.100264 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85d989d55b-spf52"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.102306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx" (OuterVolumeSpecName: "kube-api-access-ngwsx") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "kube-api-access-ngwsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.102385 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.102707 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.102942 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z768l"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.103097 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.105619 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.158899 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-scripts\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159002 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159060 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-credential-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159145 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-internal-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159198 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-fernet-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159242 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-config-data\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159297 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5k7v\" (UniqueName: \"kubernetes.io/projected/54aa85fa-25cb-409a-be60-4c0cb8468466-kube-api-access-d5k7v\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159428 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159452 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159505 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-combined-ca-bundle\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-public-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159576 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159620 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4v4j\" (UniqueName: \"kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.159758 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngwsx\" (UniqueName: \"kubernetes.io/projected/4a1141e6-4801-4733-891c-3e9607c36aca-kube-api-access-ngwsx\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.250807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-scripts\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.258947 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-credential-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.260958 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261014 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261037 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261076 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261115 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4v4j\" (UniqueName: \"kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261155 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-combined-ca-bundle\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.261193 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.264418 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.264983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-internal-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.266501 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.269576 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-config-data\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.270244 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-fernet-keys\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.270355 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54aa85fa-25cb-409a-be60-4c0cb8468466-public-tls-certs\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.272757 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.276181 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.301588 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5k7v\" (UniqueName: \"kubernetes.io/projected/54aa85fa-25cb-409a-be60-4c0cb8468466-kube-api-access-d5k7v\") pod \"keystone-65558fd5f5-5tmzj\" (UID: \"54aa85fa-25cb-409a-be60-4c0cb8468466\") " pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.301626 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.305919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.329702 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config" (OuterVolumeSpecName: "config") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.334101 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.340353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4v4j\" (UniqueName: \"kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j\") pod \"placement-85d989d55b-spf52\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.367742 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.379065 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.392415 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.394886 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.413800 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.414200 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.414670 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rpmz6"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.448321 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.454277 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.463376 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8rc\" (UniqueName: \"kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475479 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475553 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475629 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475673 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475715 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475726 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475734 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.475911 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.529671 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b99d76fbb-c92mx"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.531269 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.587485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-config-data\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.587554 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-internal-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.587576 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-scripts\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.587598 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-combined-ca-bundle\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.587677 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602227 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602258 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8rc\" (UniqueName: \"kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602282 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602305 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-public-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602324 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shh8s\" (UniqueName: \"kubernetes.io/projected/46abdc7f-1f99-44dc-8cc1-3a7c61186946-kube-api-access-shh8s\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46abdc7f-1f99-44dc-8cc1-3a7c61186946-logs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.602534 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.603678 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.606192 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a1141e6-4801-4733-891c-3e9607c36aca" (UID: "4a1141e6-4801-4733-891c-3e9607c36aca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.633621 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.643640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.645313 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.649867 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8rc\" (UniqueName: \"kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc\") pod \"barbican-worker-8ccd8f54c-5rvhq\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " pod="openstack/barbican-worker-8ccd8f54c-5rvhq"
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.666568 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b99d76fbb-c92mx"]
Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722725 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-internal-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722766 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-scripts\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722790 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-combined-ca-bundle\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722883 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-public-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shh8s\" (UniqueName: \"kubernetes.io/projected/46abdc7f-1f99-44dc-8cc1-3a7c61186946-kube-api-access-shh8s\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722946 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46abdc7f-1f99-44dc-8cc1-3a7c61186946-logs\") pod 
\"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.722990 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-config-data\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.723041 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1141e6-4801-4733-891c-3e9607c36aca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.726524 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.728208 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.739144 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46abdc7f-1f99-44dc-8cc1-3a7c61186946-logs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.741711 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.745182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-combined-ca-bundle\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.746497 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.748225 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-scripts\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.749424 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-public-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.749799 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-config-data\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.769844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46abdc7f-1f99-44dc-8cc1-3a7c61186946-internal-tls-certs\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.799469 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.820935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shh8s\" (UniqueName: 
\"kubernetes.io/projected/46abdc7f-1f99-44dc-8cc1-3a7c61186946-kube-api-access-shh8s\") pod \"placement-7b99d76fbb-c92mx\" (UID: \"46abdc7f-1f99-44dc-8cc1-3a7c61186946\") " pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.847055 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.847231 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.847290 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.847424 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwtgh\" (UniqueName: \"kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: 
I0320 13:42:24.847444 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.938934 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.971978 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.972391 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwtgh\" (UniqueName: \"kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.972691 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.972971 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.973177 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:24 crc kubenswrapper[4895]: I0320 13:42:24.974544 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.011624 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.016321 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.020450 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.044918 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.052271 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.052368 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.068741 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-5shnc" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.068860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwtgh\" (UniqueName: \"kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh\") pod \"barbican-keystone-listener-77fb447c54-w8vrq\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.069319 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerStarted","Data":"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25"} Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.078968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.079043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.079098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4p7\" (UniqueName: \"kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.079232 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.079268 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.079301 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.127052 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.127159 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.128175 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.167245 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56f588c54c-qdk5g"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.169746 4895 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.183579 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.183638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.183700 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.183748 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.183835 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.184095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4p7\" (UniqueName: \"kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.184972 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.186154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.186783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.186995 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 
13:42:25.192583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.210432 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56f588c54c-qdk5g"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.236908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4p7\" (UniqueName: \"kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7\") pod \"dnsmasq-dns-848cf88cfc-wlmw6\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.255317 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4616a449-db7a-40c3-9960-975e92a69030" path="/var/lib/kubelet/pods/4616a449-db7a-40c3-9960-975e92a69030/volumes" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.266358 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b7d445cd4-s2zjm"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.289818 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b7d445cd4-s2zjm"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.290225 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.290296 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.292272 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.293381 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.296101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.342738 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.386704 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.388953 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-combined-ca-bundle\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.389137 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443e18a5-4a5b-4678-8a19-dca8434a8a31-logs\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.389281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data-custom\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " 
pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.389460 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.389552 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4xz\" (UniqueName: \"kubernetes.io/projected/443e18a5-4a5b-4678-8a19-dca8434a8a31-kube-api-access-8c4xz\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.468541 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.479441 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-5shnc"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9js5\" (UniqueName: \"kubernetes.io/projected/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-kube-api-access-l9js5\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492709 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: 
\"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492740 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492760 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-logs\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492785 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data-custom\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492830 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9l4\" (UniqueName: 
\"kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492884 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492909 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4xz\" (UniqueName: \"kubernetes.io/projected/443e18a5-4a5b-4678-8a19-dca8434a8a31-kube-api-access-8c4xz\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492929 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data-custom\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492979 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-combined-ca-bundle\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.492998 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.493031 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443e18a5-4a5b-4678-8a19-dca8434a8a31-logs\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.493075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.493089 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.497826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data-custom\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.499761 
4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443e18a5-4a5b-4678-8a19-dca8434a8a31-logs\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.503683 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-combined-ca-bundle\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.517031 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443e18a5-4a5b-4678-8a19-dca8434a8a31-config-data\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.519079 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4xz\" (UniqueName: \"kubernetes.io/projected/443e18a5-4a5b-4678-8a19-dca8434a8a31-kube-api-access-8c4xz\") pod \"barbican-worker-56f588c54c-qdk5g\" (UID: \"443e18a5-4a5b-4678-8a19-dca8434a8a31\") " pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.544273 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85d989d55b-spf52"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.568028 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56f588c54c-qdk5g" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.593969 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65558fd5f5-5tmzj"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595032 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595116 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595161 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9js5\" (UniqueName: \"kubernetes.io/projected/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-kube-api-access-l9js5\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595184 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595209 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595228 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-logs\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595270 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9l4\" (UniqueName: \"kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.595334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data-custom\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.613385 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.613580 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data-custom\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.615449 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.615550 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-combined-ca-bundle\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.616365 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-logs\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.619376 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-config-data\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.620984 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.636423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.642873 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9js5\" (UniqueName: \"kubernetes.io/projected/ebaa89fb-ad42-4038-a2fa-cbc9d2711354-kube-api-access-l9js5\") pod \"barbican-keystone-listener-5b7d445cd4-s2zjm\" (UID: \"ebaa89fb-ad42-4038-a2fa-cbc9d2711354\") " pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.647365 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.647819 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9l4\" (UniqueName: \"kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4\") pod \"barbican-api-5cd7655cd6-lh976\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.648192 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.691704 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"] Mar 20 13:42:25 crc kubenswrapper[4895]: I0320 13:42:25.882201 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b99d76fbb-c92mx"] Mar 20 13:42:26 crc kubenswrapper[4895]: W0320 13:42:26.092958 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2708316e_74e9_4fcc_948c_02f3c3e712ff.slice/crio-daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076 WatchSource:0}: Error finding container daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076: Status 404 returned error can't find the container with id daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076 Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.094452 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.099520 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85d989d55b-spf52" 
event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerStarted","Data":"949dc6cb2ae0ab3721265ac06f55535c1b449a1b61723f0889b56edc8f446e3f"} Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.113538 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65558fd5f5-5tmzj" event={"ID":"54aa85fa-25cb-409a-be60-4c0cb8468466","Type":"ContainerStarted","Data":"c9881caad109b2b51df9798f87985325230587efbbf648cd0712d9865d01eb90"} Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.116919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57bfc" event={"ID":"19fedca4-15c2-4975-807e-e0c9ded7f329","Type":"ContainerStarted","Data":"0a4a6f07e6900137b6e30afe86a56ebdb7db9c6e4fbd5df523646bb2ba158250"} Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.137461 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-57bfc" podStartSLOduration=5.252575847 podStartE2EDuration="54.137443628s" podCreationTimestamp="2026-03-20 13:41:32 +0000 UTC" firstStartedPulling="2026-03-20 13:41:34.820857222 +0000 UTC m=+1194.330576188" lastFinishedPulling="2026-03-20 13:42:23.705725003 +0000 UTC m=+1243.215443969" observedRunningTime="2026-03-20 13:42:26.132646268 +0000 UTC m=+1245.642365234" watchObservedRunningTime="2026-03-20 13:42:26.137443628 +0000 UTC m=+1245.647162594" Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.139329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b99d76fbb-c92mx" event={"ID":"46abdc7f-1f99-44dc-8cc1-3a7c61186946","Type":"ContainerStarted","Data":"73564300c005090da827d780e4384f5da9cc03d178abb7606ab44c5a111e8f15"} Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.142405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" 
event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerStarted","Data":"33140f8e1d3439faf5f86e2a9943d2a1a29705de4557ef8c882ab90fcc673de2"} Mar 20 13:42:26 crc kubenswrapper[4895]: W0320 13:42:26.303642 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67441dde_3458_42fb_a8fd_556636ed6613.slice/crio-e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3 WatchSource:0}: Error finding container e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3: Status 404 returned error can't find the container with id e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3 Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.322418 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.411288 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56f588c54c-qdk5g"] Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.605359 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:26 crc kubenswrapper[4895]: I0320 13:42:26.620906 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b7d445cd4-s2zjm"] Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.175921 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerStarted","Data":"e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.218066 4895 generic.go:334] "Generic (PLEG): container finished" podID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerID="0dd97c50eae4b016f75ff6d52a35f8340ac869cfb11bbc8f632f3e50a9b5b0e5" exitCode=0 Mar 20 13:42:27 crc 
kubenswrapper[4895]: I0320 13:42:27.260795 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1141e6-4801-4733-891c-3e9607c36aca" path="/var/lib/kubelet/pods/4a1141e6-4801-4733-891c-3e9607c36aca/volumes" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.261669 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" event={"ID":"2708316e-74e9-4fcc-948c-02f3c3e712ff","Type":"ContainerDied","Data":"0dd97c50eae4b016f75ff6d52a35f8340ac869cfb11bbc8f632f3e50a9b5b0e5"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.261694 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" event={"ID":"2708316e-74e9-4fcc-948c-02f3c3e712ff","Type":"ContainerStarted","Data":"daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.289286 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerStarted","Data":"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.289342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerStarted","Data":"c463dfb1f9f16fc3e6e8b18214898d27091772c0db2f6235044b8828fff3ec15"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.334941 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lh78p" event={"ID":"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c","Type":"ContainerStarted","Data":"ec31ceca8278d8dfe9f061298e95a8c044f65b3cbc9b3f63a3e7fec5d113cbfe"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.355446 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b99d76fbb-c92mx" 
event={"ID":"46abdc7f-1f99-44dc-8cc1-3a7c61186946","Type":"ContainerStarted","Data":"e5e8ce339daa5b5f18b8232de2d027c369ddb9f3c433bddafa2a2d658562ab1c"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.355489 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b99d76fbb-c92mx" event={"ID":"46abdc7f-1f99-44dc-8cc1-3a7c61186946","Type":"ContainerStarted","Data":"14975d08725b60d6b2896fea652c9050ed20bdee307f2f534ed07ec6b80e5c08"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.356463 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.356492 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b99d76fbb-c92mx" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.357968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" event={"ID":"ebaa89fb-ad42-4038-a2fa-cbc9d2711354","Type":"ContainerStarted","Data":"0df2ef543e656e4abf147453e1587b548bdd09971c3b7d74d5d6855c0967dcc0"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.358826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56f588c54c-qdk5g" event={"ID":"443e18a5-4a5b-4678-8a19-dca8434a8a31","Type":"ContainerStarted","Data":"57dcd7aec32b9da019a3f49c644572bb668dc3039d80f5ec4bee52dc26c72b87"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.383742 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85d989d55b-spf52" event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerStarted","Data":"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.383805 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85d989d55b-spf52" 
event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerStarted","Data":"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.384997 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85d989d55b-spf52" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.385027 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85d989d55b-spf52" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.406923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65558fd5f5-5tmzj" event={"ID":"54aa85fa-25cb-409a-be60-4c0cb8468466","Type":"ContainerStarted","Data":"a0bbb7cf97f89a6a74ff909b0297a487a744915c78408b157406f369a2eaa6d6"} Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.407966 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-65558fd5f5-5tmzj" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.446324 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b99d76fbb-c92mx" podStartSLOduration=3.446305334 podStartE2EDuration="3.446305334s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:27.431492705 +0000 UTC m=+1246.941211671" watchObservedRunningTime="2026-03-20 13:42:27.446305334 +0000 UTC m=+1246.956024300" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.446476 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-lh78p" podStartSLOduration=2.905677972 podStartE2EDuration="54.446471608s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="2026-03-20 13:41:34.941194825 +0000 UTC m=+1194.450913791" lastFinishedPulling="2026-03-20 13:42:26.481988461 +0000 UTC 
m=+1245.991707427" observedRunningTime="2026-03-20 13:42:27.38186762 +0000 UTC m=+1246.891586586" watchObservedRunningTime="2026-03-20 13:42:27.446471608 +0000 UTC m=+1246.956190574" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.501999 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-65558fd5f5-5tmzj" podStartSLOduration=4.501973888 podStartE2EDuration="4.501973888s" podCreationTimestamp="2026-03-20 13:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:27.483009633 +0000 UTC m=+1246.992728609" watchObservedRunningTime="2026-03-20 13:42:27.501973888 +0000 UTC m=+1247.011692854" Mar 20 13:42:27 crc kubenswrapper[4895]: I0320 13:42:27.524819 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-85d989d55b-spf52" podStartSLOduration=3.52480083 podStartE2EDuration="3.52480083s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:27.523756696 +0000 UTC m=+1247.033475662" watchObservedRunningTime="2026-03-20 13:42:27.52480083 +0000 UTC m=+1247.034519806" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.426479 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84c5f65f5b-7c6jl"] Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.435198 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.450424 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.450647 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.476772 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84c5f65f5b-7c6jl"] Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.495352 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" event={"ID":"2708316e-74e9-4fcc-948c-02f3c3e712ff","Type":"ContainerStarted","Data":"f3fe88ecd6d7e1c1ca06b312baed9c327b460504a0195a99f2642a02a3f1e305"} Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.500588 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.511623 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerStarted","Data":"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686"} Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.570274 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" podStartSLOduration=4.570252289 podStartE2EDuration="4.570252289s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:28.54279461 +0000 UTC m=+1248.052513576" watchObservedRunningTime="2026-03-20 13:42:28.570252289 +0000 UTC m=+1248.079971255" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 
13:42:28.574537 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-internal-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data-custom\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-public-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574661 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574687 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-logs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 
13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574708 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-combined-ca-bundle\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.574772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cws7p\" (UniqueName: \"kubernetes.io/projected/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-kube-api-access-cws7p\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.600202 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cd7655cd6-lh976" podStartSLOduration=4.600180893 podStartE2EDuration="4.600180893s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:28.563045564 +0000 UTC m=+1248.072764530" watchObservedRunningTime="2026-03-20 13:42:28.600180893 +0000 UTC m=+1248.109899859" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676487 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-logs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676557 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-combined-ca-bundle\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676716 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cws7p\" (UniqueName: \"kubernetes.io/projected/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-kube-api-access-cws7p\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676883 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-internal-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676972 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data-custom\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.676994 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-public-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.678625 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-logs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.684754 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-internal-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.684792 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-public-tls-certs\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.688296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-combined-ca-bundle\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.698043 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data-custom\") pod 
\"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.708857 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cws7p\" (UniqueName: \"kubernetes.io/projected/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-kube-api-access-cws7p\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.729729 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ae76b1-0feb-45b7-9e94-063bc0c58ded-config-data\") pod \"barbican-api-84c5f65f5b-7c6jl\" (UID: \"f2ae76b1-0feb-45b7-9e94-063bc0c58ded\") " pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:28 crc kubenswrapper[4895]: I0320 13:42:28.823184 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:29 crc kubenswrapper[4895]: I0320 13:42:29.535412 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:29 crc kubenswrapper[4895]: I0320 13:42:29.536133 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:30 crc kubenswrapper[4895]: I0320 13:42:30.555211 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerStarted","Data":"a8a34bb45f13f06b73de9c45ee24079b73349126dc38caec857cffc252977dd2"} Mar 20 13:42:30 crc kubenswrapper[4895]: I0320 13:42:30.557969 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerStarted","Data":"bcbca4e6a607ca47535a89f98e84b2dcdb6162ba12d0567391a7455feda65204"} Mar 20 13:42:30 crc kubenswrapper[4895]: I0320 13:42:30.559730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56f588c54c-qdk5g" event={"ID":"443e18a5-4a5b-4678-8a19-dca8434a8a31","Type":"ContainerStarted","Data":"b73585245f4ac3966db505c85aa8d21b3e538322daa001461fb8cc4bd179fa89"} Mar 20 13:42:30 crc kubenswrapper[4895]: I0320 13:42:30.562299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" event={"ID":"ebaa89fb-ad42-4038-a2fa-cbc9d2711354","Type":"ContainerStarted","Data":"1771ffd05733f84b6db48e9b9cfef105d5443b20f0492a9d7a8536af084e866f"} Mar 20 13:42:30 crc kubenswrapper[4895]: I0320 13:42:30.607632 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84c5f65f5b-7c6jl"] Mar 20 13:42:30 crc kubenswrapper[4895]: W0320 13:42:30.642151 4895 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2ae76b1_0feb_45b7_9e94_063bc0c58ded.slice/crio-4c5e704188bfc1b4348d24461a61a06da036b91724f9366e6bb06387ead4cfb1 WatchSource:0}: Error finding container 4c5e704188bfc1b4348d24461a61a06da036b91724f9366e6bb06387ead4cfb1: Status 404 returned error can't find the container with id 4c5e704188bfc1b4348d24461a61a06da036b91724f9366e6bb06387ead4cfb1 Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.578047 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerStarted","Data":"a0106560817efe9b39b9f990c75ca23306d19c839c82ec7c9734edf6cfeb45b6"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.581679 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56f588c54c-qdk5g" event={"ID":"443e18a5-4a5b-4678-8a19-dca8434a8a31","Type":"ContainerStarted","Data":"f89244061f989890c3c942c7b12990cc3c00cfad1d7ffb042402bc62a85c4f51"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.584367 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" event={"ID":"ebaa89fb-ad42-4038-a2fa-cbc9d2711354","Type":"ContainerStarted","Data":"c04bd85eba5fc33ded5cf54c3179d61e625179c0a358933481583a43c812a792"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.587683 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerStarted","Data":"0ffe15514115b377c955503f6010d0347be260be2de9ff5930943a107a60892e"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.591066 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84c5f65f5b-7c6jl" 
event={"ID":"f2ae76b1-0feb-45b7-9e94-063bc0c58ded","Type":"ContainerStarted","Data":"194d7b309f5088243b85d59ea3ab87a61789ec18af2fe88d99c31d7b1ce7a403"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.591138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84c5f65f5b-7c6jl" event={"ID":"f2ae76b1-0feb-45b7-9e94-063bc0c58ded","Type":"ContainerStarted","Data":"ead0f0104f8fdee2fcb232bd4c5d50f9bf86e9e129bfefe7168c31722bea6329"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.591302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.591326 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84c5f65f5b-7c6jl" event={"ID":"f2ae76b1-0feb-45b7-9e94-063bc0c58ded","Type":"ContainerStarted","Data":"4c5e704188bfc1b4348d24461a61a06da036b91724f9366e6bb06387ead4cfb1"} Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.591340 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.604749 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" podStartSLOduration=3.22101215 podStartE2EDuration="7.604730676s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="2026-03-20 13:42:25.758778085 +0000 UTC m=+1245.268497051" lastFinishedPulling="2026-03-20 13:42:30.142496611 +0000 UTC m=+1249.652215577" observedRunningTime="2026-03-20 13:42:31.594782928 +0000 UTC m=+1251.104501894" watchObservedRunningTime="2026-03-20 13:42:31.604730676 +0000 UTC m=+1251.114449642" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.661966 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56f588c54c-qdk5g" podStartSLOduration=3.924062954 
podStartE2EDuration="7.661950685s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="2026-03-20 13:42:26.413977485 +0000 UTC m=+1245.923696451" lastFinishedPulling="2026-03-20 13:42:30.151865216 +0000 UTC m=+1249.661584182" observedRunningTime="2026-03-20 13:42:31.622754508 +0000 UTC m=+1251.132473494" watchObservedRunningTime="2026-03-20 13:42:31.661950685 +0000 UTC m=+1251.171669651" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.668904 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"] Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.673616 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84c5f65f5b-7c6jl" podStartSLOduration=3.673600281 podStartE2EDuration="3.673600281s" podCreationTimestamp="2026-03-20 13:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:31.653932921 +0000 UTC m=+1251.163651897" watchObservedRunningTime="2026-03-20 13:42:31.673600281 +0000 UTC m=+1251.183319247" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.690208 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" podStartSLOduration=3.861548075 podStartE2EDuration="7.690190591s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="2026-03-20 13:42:26.320004025 +0000 UTC m=+1245.829722991" lastFinishedPulling="2026-03-20 13:42:30.148646541 +0000 UTC m=+1249.658365507" observedRunningTime="2026-03-20 13:42:31.679060116 +0000 UTC m=+1251.188779082" watchObservedRunningTime="2026-03-20 13:42:31.690190591 +0000 UTC m=+1251.199909557" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.720546 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b7d445cd4-s2zjm" 
podStartSLOduration=4.251858703 podStartE2EDuration="7.720524605s" podCreationTimestamp="2026-03-20 13:42:24 +0000 UTC" firstStartedPulling="2026-03-20 13:42:26.683602383 +0000 UTC m=+1246.193321349" lastFinishedPulling="2026-03-20 13:42:30.152268285 +0000 UTC m=+1249.661987251" observedRunningTime="2026-03-20 13:42:31.699075765 +0000 UTC m=+1251.208794731" watchObservedRunningTime="2026-03-20 13:42:31.720524605 +0000 UTC m=+1251.230243571" Mar 20 13:42:31 crc kubenswrapper[4895]: I0320 13:42:31.758823 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:32 crc kubenswrapper[4895]: I0320 13:42:32.613032 4895 generic.go:334] "Generic (PLEG): container finished" podID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" containerID="ec31ceca8278d8dfe9f061298e95a8c044f65b3cbc9b3f63a3e7fec5d113cbfe" exitCode=0 Mar 20 13:42:32 crc kubenswrapper[4895]: I0320 13:42:32.613123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lh78p" event={"ID":"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c","Type":"ContainerDied","Data":"ec31ceca8278d8dfe9f061298e95a8c044f65b3cbc9b3f63a3e7fec5d113cbfe"} Mar 20 13:42:32 crc kubenswrapper[4895]: I0320 13:42:32.616608 4895 generic.go:334] "Generic (PLEG): container finished" podID="19fedca4-15c2-4975-807e-e0c9ded7f329" containerID="0a4a6f07e6900137b6e30afe86a56ebdb7db9c6e4fbd5df523646bb2ba158250" exitCode=0 Mar 20 13:42:32 crc kubenswrapper[4895]: I0320 13:42:32.616638 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57bfc" event={"ID":"19fedca4-15c2-4975-807e-e0c9ded7f329","Type":"ContainerDied","Data":"0a4a6f07e6900137b6e30afe86a56ebdb7db9c6e4fbd5df523646bb2ba158250"} Mar 20 13:42:33 crc kubenswrapper[4895]: I0320 13:42:33.631248 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" 
containerName="barbican-worker" containerID="cri-o://a0106560817efe9b39b9f990c75ca23306d19c839c82ec7c9734edf6cfeb45b6" gracePeriod=30 Mar 20 13:42:33 crc kubenswrapper[4895]: I0320 13:42:33.631014 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker-log" containerID="cri-o://bcbca4e6a607ca47535a89f98e84b2dcdb6162ba12d0567391a7455feda65204" gracePeriod=30 Mar 20 13:42:33 crc kubenswrapper[4895]: I0320 13:42:33.631686 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener-log" containerID="cri-o://a8a34bb45f13f06b73de9c45ee24079b73349126dc38caec857cffc252977dd2" gracePeriod=30 Mar 20 13:42:33 crc kubenswrapper[4895]: I0320 13:42:33.631696 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener" containerID="cri-o://0ffe15514115b377c955503f6010d0347be260be2de9ff5930943a107a60892e" gracePeriod=30 Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.409972 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.422693 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-57bfc" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.511503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data\") pod \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.511600 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs\") pod \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.511666 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz2bl\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl\") pod \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.511736 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle\") pod \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.511792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts\") pod \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\" (UID: \"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.517699 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs" (OuterVolumeSpecName: "certs") pod "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" (UID: "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.521238 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl" (OuterVolumeSpecName: "kube-api-access-tz2bl") pod "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" (UID: "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c"). InnerVolumeSpecName "kube-api-access-tz2bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.522779 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts" (OuterVolumeSpecName: "scripts") pod "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" (UID: "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.539086 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" (UID: "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.541707 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data" (OuterVolumeSpecName: "config-data") pod "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" (UID: "8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614443 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614516 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614613 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614705 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt7gs\" (UniqueName: \"kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614750 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.614798 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id\") pod \"19fedca4-15c2-4975-807e-e0c9ded7f329\" (UID: \"19fedca4-15c2-4975-807e-e0c9ded7f329\") " Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615653 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615679 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615692 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz2bl\" (UniqueName: \"kubernetes.io/projected/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-kube-api-access-tz2bl\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615704 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615714 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.615772 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.617713 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.618884 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs" (OuterVolumeSpecName: "kube-api-access-dt7gs") pod "19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "kube-api-access-dt7gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.620155 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts" (OuterVolumeSpecName: "scripts") pod "19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.641838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-lh78p" event={"ID":"8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c","Type":"ContainerDied","Data":"c0ff89c98dcdcc42fcf84e9bca3783b909810286ae71fc37d8f7faa1f388120f"} Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.641891 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ff89c98dcdcc42fcf84e9bca3783b909810286ae71fc37d8f7faa1f388120f" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.641964 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-lh78p" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.658269 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.659229 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-57bfc" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.659789 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-57bfc" event={"ID":"19fedca4-15c2-4975-807e-e0c9ded7f329","Type":"ContainerDied","Data":"b324d30244f6903b16e7e36822d27e18242390948e880e84738bde73e3467d99"} Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.659815 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b324d30244f6903b16e7e36822d27e18242390948e880e84738bde73e3467d99" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.665732 4895 generic.go:334] "Generic (PLEG): container finished" podID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerID="bcbca4e6a607ca47535a89f98e84b2dcdb6162ba12d0567391a7455feda65204" exitCode=143 Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.665842 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerDied","Data":"bcbca4e6a607ca47535a89f98e84b2dcdb6162ba12d0567391a7455feda65204"} Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.667927 4895 generic.go:334] "Generic (PLEG): container finished" podID="67441dde-3458-42fb-a8fd-556636ed6613" containerID="a8a34bb45f13f06b73de9c45ee24079b73349126dc38caec857cffc252977dd2" exitCode=143 Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.667955 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerDied","Data":"a8a34bb45f13f06b73de9c45ee24079b73349126dc38caec857cffc252977dd2"} Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.708932 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data" (OuterVolumeSpecName: "config-data") pod 
"19fedca4-15c2-4975-807e-e0c9ded7f329" (UID: "19fedca4-15c2-4975-807e-e0c9ded7f329"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717693 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717729 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt7gs\" (UniqueName: \"kubernetes.io/projected/19fedca4-15c2-4975-807e-e0c9ded7f329-kube-api-access-dt7gs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717740 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717750 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19fedca4-15c2-4975-807e-e0c9ded7f329-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717760 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.717768 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fedca4-15c2-4975-807e-e0c9ded7f329-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.816819 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-wbsfq"] Mar 20 13:42:34 crc kubenswrapper[4895]: E0320 
13:42:34.817331 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" containerName="cinder-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.817354 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" containerName="cinder-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: E0320 13:42:34.817374 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" containerName="cloudkitty-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.817382 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" containerName="cloudkitty-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.817765 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" containerName="cloudkitty-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.817792 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" containerName="cinder-db-sync" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.818731 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.829010 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-wbsfq"] Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.843824 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.844060 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.844214 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.844353 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pbltf" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.845020 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.924617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.924757 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.924788 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.924819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:34 crc kubenswrapper[4895]: I0320 13:42:34.924839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wh8\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.012317 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.014115 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.019481 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.019678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6kpmz" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.019783 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.019876 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.029171 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.029342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.029376 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.029431 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.029448 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wh8\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.038361 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.042980 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.045564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.050312 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.050713 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.072650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wh8\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8\") pod \"cloudkitty-storageinit-wbsfq\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.087900 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.088183 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="dnsmasq-dns" containerID="cri-o://f3fe88ecd6d7e1c1ca06b312baed9c327b460504a0195a99f2642a02a3f1e305" gracePeriod=10 Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.090076 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132189 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132523 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c256\" (UniqueName: \"kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132679 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.132721 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.160557 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.162260 4895 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.162737 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.179143 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.184589 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.186288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.189271 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.208176 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235090 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235142 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235182 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235243 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c256\" (UniqueName: \"kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235283 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.235320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.236593 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.242444 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " 
pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.244409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.245302 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.245361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.255327 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c256\" (UniqueName: \"kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256\") pod \"cinder-scheduler-0\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337077 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337170 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337264 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337293 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337338 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337432 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337464 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337483 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xng7m\" (UniqueName: \"kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337511 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337562 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85l47\" (UniqueName: 
\"kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.337591 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.356273 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.387807 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: connect: connection refused" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.439937 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440005 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440182 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440206 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xng7m\" (UniqueName: \"kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440346 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85l47\" (UniqueName: \"kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47\") pod 
\"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440372 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440490 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440524 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.440544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.441143 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.442300 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.442870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.443009 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.444067 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.447213 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.447647 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.447841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.448520 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.458416 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.460007 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xng7m\" (UniqueName: \"kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m\") pod \"dnsmasq-dns-6578955fd5-bv7ct\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 
crc kubenswrapper[4895]: I0320 13:42:35.460943 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85l47\" (UniqueName: \"kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.463879 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.505283 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.515563 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.591040 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.698554 4895 generic.go:334] "Generic (PLEG): container finished" podID="67441dde-3458-42fb-a8fd-556636ed6613" containerID="0ffe15514115b377c955503f6010d0347be260be2de9ff5930943a107a60892e" exitCode=0 Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.698634 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerDied","Data":"0ffe15514115b377c955503f6010d0347be260be2de9ff5930943a107a60892e"} Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.718354 4895 generic.go:334] "Generic (PLEG): container finished" podID="2708316e-74e9-4fcc-948c-02f3c3e712ff" 
containerID="f3fe88ecd6d7e1c1ca06b312baed9c327b460504a0195a99f2642a02a3f1e305" exitCode=0 Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.718484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" event={"ID":"2708316e-74e9-4fcc-948c-02f3c3e712ff","Type":"ContainerDied","Data":"f3fe88ecd6d7e1c1ca06b312baed9c327b460504a0195a99f2642a02a3f1e305"} Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.745666 4895 generic.go:334] "Generic (PLEG): container finished" podID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerID="a0106560817efe9b39b9f990c75ca23306d19c839c82ec7c9734edf6cfeb45b6" exitCode=0 Mar 20 13:42:35 crc kubenswrapper[4895]: I0320 13:42:35.745710 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerDied","Data":"a0106560817efe9b39b9f990c75ca23306d19c839c82ec7c9734edf6cfeb45b6"} Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.048633 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.049086 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b745c9b4c-q9rhb" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-api" containerID="cri-o://05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3" gracePeriod=30 Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.049207 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b745c9b4c-q9rhb" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" containerID="cri-o://0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9" gracePeriod=30 Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.100552 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-56bf665d85-xzq8s"] Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.102571 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.127784 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56bf665d85-xzq8s"] Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.163555 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6b745c9b4c-q9rhb" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9696/\": read tcp 10.217.0.2:53642->10.217.0.178:9696: read: connection reset by peer" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259711 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-ovndb-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259779 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-combined-ca-bundle\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259807 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 
13:42:36.259858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-public-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259904 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktvq\" (UniqueName: \"kubernetes.io/projected/8eecbe8c-a839-4641-b617-921265cd8f14-kube-api-access-mktvq\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-internal-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.259974 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-httpd-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktvq\" (UniqueName: \"kubernetes.io/projected/8eecbe8c-a839-4641-b617-921265cd8f14-kube-api-access-mktvq\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc 
kubenswrapper[4895]: I0320 13:42:36.361750 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-internal-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361808 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-httpd-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-ovndb-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-combined-ca-bundle\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.361958 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-public-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.370976 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-ovndb-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.375177 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.378347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-httpd-config\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.378519 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-public-tls-certs\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.379120 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-internal-tls-certs\") pod 
\"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.382419 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktvq\" (UniqueName: \"kubernetes.io/projected/8eecbe8c-a839-4641-b617-921265cd8f14-kube-api-access-mktvq\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.382639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eecbe8c-a839-4641-b617-921265cd8f14-combined-ca-bundle\") pod \"neutron-56bf665d85-xzq8s\" (UID: \"8eecbe8c-a839-4641-b617-921265cd8f14\") " pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.453783 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.789373 4895 generic.go:334] "Generic (PLEG): container finished" podID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerID="0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9" exitCode=0 Mar 20 13:42:36 crc kubenswrapper[4895]: I0320 13:42:36.789679 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerDied","Data":"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9"} Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.145650 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.590778 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:37 crc 
kubenswrapper[4895]: I0320 13:42:37.754257 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.781020 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6b745c9b4c-q9rhb" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9696/\": dial tcp 10.217.0.178:9696: connect: connection refused" Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.838652 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" event={"ID":"ab625b59-c43a-498a-b79d-c7952511fe4e","Type":"ContainerDied","Data":"33140f8e1d3439faf5f86e2a9943d2a1a29705de4557ef8c882ab90fcc673de2"} Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.838695 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33140f8e1d3439faf5f86e2a9943d2a1a29705de4557ef8c882ab90fcc673de2" Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.844110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" event={"ID":"67441dde-3458-42fb-a8fd-556636ed6613","Type":"ContainerDied","Data":"e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3"} Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.844154 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3409576438ea9136e7234f18fcc8af2dd920d0738aa6d6bac722c805184b3e3" Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.865033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" event={"ID":"2708316e-74e9-4fcc-948c-02f3c3e712ff","Type":"ContainerDied","Data":"daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076"} Mar 20 13:42:37 crc kubenswrapper[4895]: I0320 13:42:37.865065 4895 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daff72ab30a08055c4d1e1018572b6912e4b738deb4b14adf02b972b588c2076" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.102461 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.133237 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.150949 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.224642 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwtgh\" (UniqueName: \"kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh\") pod \"67441dde-3458-42fb-a8fd-556636ed6613\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.224708 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle\") pod \"67441dde-3458-42fb-a8fd-556636ed6613\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.227417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4p7\" (UniqueName: \"kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.227548 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kt8rc\" (UniqueName: \"kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc\") pod \"ab625b59-c43a-498a-b79d-c7952511fe4e\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.227929 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.227972 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data\") pod \"67441dde-3458-42fb-a8fd-556636ed6613\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228061 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs\") pod \"ab625b59-c43a-498a-b79d-c7952511fe4e\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228085 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom\") pod \"67441dde-3458-42fb-a8fd-556636ed6613\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228119 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom\") pod \"ab625b59-c43a-498a-b79d-c7952511fe4e\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " Mar 20 13:42:38 crc 
kubenswrapper[4895]: I0320 13:42:38.228137 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228155 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228227 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle\") pod \"ab625b59-c43a-498a-b79d-c7952511fe4e\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0\") pod \"2708316e-74e9-4fcc-948c-02f3c3e712ff\" (UID: \"2708316e-74e9-4fcc-948c-02f3c3e712ff\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228295 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data\") pod 
\"ab625b59-c43a-498a-b79d-c7952511fe4e\" (UID: \"ab625b59-c43a-498a-b79d-c7952511fe4e\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.228314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs\") pod \"67441dde-3458-42fb-a8fd-556636ed6613\" (UID: \"67441dde-3458-42fb-a8fd-556636ed6613\") " Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.232566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh" (OuterVolumeSpecName: "kube-api-access-kwtgh") pod "67441dde-3458-42fb-a8fd-556636ed6613" (UID: "67441dde-3458-42fb-a8fd-556636ed6613"). InnerVolumeSpecName "kube-api-access-kwtgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.232913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs" (OuterVolumeSpecName: "logs") pod "ab625b59-c43a-498a-b79d-c7952511fe4e" (UID: "ab625b59-c43a-498a-b79d-c7952511fe4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.234667 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs" (OuterVolumeSpecName: "logs") pod "67441dde-3458-42fb-a8fd-556636ed6613" (UID: "67441dde-3458-42fb-a8fd-556636ed6613"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.248184 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7" (OuterVolumeSpecName: "kube-api-access-xk4p7") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "kube-api-access-xk4p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.277557 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "67441dde-3458-42fb-a8fd-556636ed6613" (UID: "67441dde-3458-42fb-a8fd-556636ed6613"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.281878 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc" (OuterVolumeSpecName: "kube-api-access-kt8rc") pod "ab625b59-c43a-498a-b79d-c7952511fe4e" (UID: "ab625b59-c43a-498a-b79d-c7952511fe4e"). InnerVolumeSpecName "kube-api-access-kt8rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.281994 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab625b59-c43a-498a-b79d-c7952511fe4e" (UID: "ab625b59-c43a-498a-b79d-c7952511fe4e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.305500 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67441dde-3458-42fb-a8fd-556636ed6613" (UID: "67441dde-3458-42fb-a8fd-556636ed6613"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.326069 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331758 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwtgh\" (UniqueName: \"kubernetes.io/projected/67441dde-3458-42fb-a8fd-556636ed6613-kube-api-access-kwtgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331838 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331856 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4p7\" (UniqueName: \"kubernetes.io/projected/2708316e-74e9-4fcc-948c-02f3c3e712ff-kube-api-access-xk4p7\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331869 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt8rc\" (UniqueName: 
\"kubernetes.io/projected/ab625b59-c43a-498a-b79d-c7952511fe4e-kube-api-access-kt8rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331883 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab625b59-c43a-498a-b79d-c7952511fe4e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331926 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331940 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331953 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.331965 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67441dde-3458-42fb-a8fd-556636ed6613-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.337549 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data" (OuterVolumeSpecName: "config-data") pod "ab625b59-c43a-498a-b79d-c7952511fe4e" (UID: "ab625b59-c43a-498a-b79d-c7952511fe4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.372095 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data" (OuterVolumeSpecName: "config-data") pod "67441dde-3458-42fb-a8fd-556636ed6613" (UID: "67441dde-3458-42fb-a8fd-556636ed6613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.372109 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.386859 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.418259 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab625b59-c43a-498a-b79d-c7952511fe4e" (UID: "ab625b59-c43a-498a-b79d-c7952511fe4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.437166 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67441dde-3458-42fb-a8fd-556636ed6613-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.437197 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.437207 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.437217 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.437227 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab625b59-c43a-498a-b79d-c7952511fe4e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.440065 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.440185 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config" (OuterVolumeSpecName: "config") pod "2708316e-74e9-4fcc-948c-02f3c3e712ff" (UID: "2708316e-74e9-4fcc-948c-02f3c3e712ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.538604 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.538630 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2708316e-74e9-4fcc-948c-02f3c3e712ff-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.659894 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-wbsfq"] Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.681788 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:38 crc kubenswrapper[4895]: W0320 13:42:38.683497 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8762b46_24e0_478c_a4ca_61b8db29481b.slice/crio-5240f72dc76a1a4d2d216c52d732d39c6ede8e6eb70145773561982afba3da3a WatchSource:0}: Error finding container 5240f72dc76a1a4d2d216c52d732d39c6ede8e6eb70145773561982afba3da3a: Status 404 returned error can't find the container with id 5240f72dc76a1a4d2d216c52d732d39c6ede8e6eb70145773561982afba3da3a Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.871346 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.913248 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerStarted","Data":"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c"} Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925566 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-central-agent" containerID="cri-o://9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854" gracePeriod=30 Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925661 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="proxy-httpd" containerID="cri-o://28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c" gracePeriod=30 Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925701 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="sg-core" containerID="cri-o://99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25" gracePeriod=30 Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925734 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-notification-agent" containerID="cri-o://48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e" gracePeriod=30 Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.925801 4895 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.932815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerStarted","Data":"5240f72dc76a1a4d2d216c52d732d39c6ede8e6eb70145773561982afba3da3a"} Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.939948 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-wlmw6" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.941234 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8ccd8f54c-5rvhq" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.941350 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77fb447c54-w8vrq" Mar 20 13:42:38 crc kubenswrapper[4895]: I0320 13:42:38.941558 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wbsfq" event={"ID":"eb790d89-50de-47f6-9361-0c2f1bf39636","Type":"ContainerStarted","Data":"ef36cf7f5f6d3e30476b30259d26c8c70c964da9b5e1d94535858969040a9c20"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:38.968373 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.983944572 podStartE2EDuration="1m5.96835211s" podCreationTimestamp="2026-03-20 13:41:33 +0000 UTC" firstStartedPulling="2026-03-20 13:41:34.820944124 +0000 UTC m=+1194.330663100" lastFinishedPulling="2026-03-20 13:42:37.805351672 +0000 UTC m=+1257.315070638" observedRunningTime="2026-03-20 13:42:38.949819326 +0000 UTC m=+1258.459538292" watchObservedRunningTime="2026-03-20 13:42:38.96835211 +0000 UTC m=+1258.478071076" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.018502 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-56bf665d85-xzq8s"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.639067 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.649941 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.662037 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-wlmw6"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.670552 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.680999 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-8ccd8f54c-5rvhq"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.693840 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.700129 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-77fb447c54-w8vrq"] Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777108 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777155 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 
13:42:39.777214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777318 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777367 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxlx\" (UniqueName: \"kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777381 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.777415 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config\") pod \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\" (UID: \"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2\") " Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.788271 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod 
"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.799285 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx" (OuterVolumeSpecName: "kube-api-access-djxlx") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "kube-api-access-djxlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.880228 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxlx\" (UniqueName: \"kubernetes.io/projected/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-kube-api-access-djxlx\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.880633 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.956872 4895 generic.go:334] "Generic (PLEG): container finished" podID="b183be69-2ea8-4753-a58d-190aa454c73c" containerID="28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c" exitCode=0 Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.956951 4895 generic.go:334] "Generic (PLEG): container finished" podID="b183be69-2ea8-4753-a58d-190aa454c73c" containerID="99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25" exitCode=2 Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.956960 4895 generic.go:334] "Generic (PLEG): container finished" podID="b183be69-2ea8-4753-a58d-190aa454c73c" containerID="9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854" exitCode=0 Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 
13:42:39.956999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerDied","Data":"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.957023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerDied","Data":"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.957033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerDied","Data":"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.959411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerStarted","Data":"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.960785 4895 generic.go:334] "Generic (PLEG): container finished" podID="44d435ed-069c-4447-845d-e957cc94e498" containerID="1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2" exitCode=0 Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.960823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" event={"ID":"44d435ed-069c-4447-845d-e957cc94e498","Type":"ContainerDied","Data":"1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.960843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" 
event={"ID":"44d435ed-069c-4447-845d-e957cc94e498","Type":"ContainerStarted","Data":"f06b38aeed5a8582bfdd6960cb8b265cc6d82e7c7c685c5ac9253103a5b6e31c"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.970094 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wbsfq" event={"ID":"eb790d89-50de-47f6-9361-0c2f1bf39636","Type":"ContainerStarted","Data":"dea42e680acad4d05684d3cac05722c24e32c1f8e0daf0745198451e0e2860c3"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.972690 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerStarted","Data":"dac46fd2dd9b382e0fd42a3ac008a54baffbf0513fb967aacfe4e2eaedaba4c0"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.972889 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.975543 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config" (OuterVolumeSpecName: "config") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.984200 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bf665d85-xzq8s" event={"ID":"8eecbe8c-a839-4641-b617-921265cd8f14","Type":"ContainerStarted","Data":"5d35157b0305ef5c051a25c7d6ddb093ada8e64482f329b3139284d0de4e1717"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.990279 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.990316 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.992611 4895 generic.go:334] "Generic (PLEG): container finished" podID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerID="05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3" exitCode=0 Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.992648 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerDied","Data":"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.992673 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b745c9b4c-q9rhb" event={"ID":"11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2","Type":"ContainerDied","Data":"5db68a92f4379b74cab069a9a785d2f8b2c33afc2c6ebf023929b0708f811fc2"} Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.992690 4895 scope.go:117] "RemoveContainer" containerID="0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9" Mar 20 13:42:39 crc kubenswrapper[4895]: I0320 13:42:39.992802 
4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b745c9b4c-q9rhb" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.024765 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-wbsfq" podStartSLOduration=6.02474576 podStartE2EDuration="6.02474576s" podCreationTimestamp="2026-03-20 13:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:40.007243439 +0000 UTC m=+1259.516962405" watchObservedRunningTime="2026-03-20 13:42:40.02474576 +0000 UTC m=+1259.534464726" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.051442 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.096368 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.169453 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.174560 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" (UID: "11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.205701 4895 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.205733 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.254986 4895 scope.go:117] "RemoveContainer" containerID="05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.319877 4895 scope.go:117] "RemoveContainer" containerID="0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9" Mar 20 13:42:40 crc kubenswrapper[4895]: E0320 13:42:40.324133 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9\": container with ID starting with 0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9 not found: ID does not exist" containerID="0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.324168 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9"} err="failed to get container status \"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9\": rpc error: code = NotFound desc = could not find container \"0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9\": container with ID starting with 0207fd09ff035b508bccd7fbab3f5f2cf2b89b49b16fceae81ec23f3120823d9 not found: ID does not exist" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.324188 4895 scope.go:117] "RemoveContainer" containerID="05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3" Mar 20 13:42:40 crc kubenswrapper[4895]: E0320 13:42:40.325106 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3\": container with ID starting with 05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3 not found: ID does not exist" containerID="05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.325123 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3"} err="failed to get container status \"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3\": rpc error: code = NotFound desc = could not find container \"05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3\": container with ID starting with 05977dba751e04735d5b33e5aacb048dcf60d75754c38fe421b56754d2bef0e3 not found: ID does not exist" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.358639 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.368194 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-6b745c9b4c-q9rhb"] Mar 20 13:42:40 crc kubenswrapper[4895]: E0320 13:42:40.384537 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44d435ed_069c_4447_845d_e957cc94e498.slice/crio-1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.598882 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.693187 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84c5f65f5b-7c6jl" Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.757422 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.757635 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5cd7655cd6-lh976" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api-log" containerID="cri-o://67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8" gracePeriod=30 Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.757755 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5cd7655cd6-lh976" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" containerID="cri-o://ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686" gracePeriod=30 Mar 20 13:42:40 crc kubenswrapper[4895]: I0320 13:42:40.771032 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5cd7655cd6-lh976" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.0.187:9311/healthcheck\": EOF" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.014501 4895 generic.go:334] "Generic (PLEG): container finished" podID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerID="67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8" exitCode=143 Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.015098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerDied","Data":"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8"} Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.042529 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" event={"ID":"44d435ed-069c-4447-845d-e957cc94e498","Type":"ContainerStarted","Data":"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101"} Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.043769 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.052611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerStarted","Data":"e91c68ae535076087507fb3fdb0dc3dd79f7bbe6de3fde2c019c726a2bfaf852"} Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.059839 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bf665d85-xzq8s" event={"ID":"8eecbe8c-a839-4641-b617-921265cd8f14","Type":"ContainerStarted","Data":"fb002e85aa55e714d77f8f2641abae0461b3253bfd922dd125876efd4e288548"} Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.059885 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bf665d85-xzq8s" 
event={"ID":"8eecbe8c-a839-4641-b617-921265cd8f14","Type":"ContainerStarted","Data":"312b29858608cd98db086469621e47768aa5d38cee4f129943eb2b644367c1be"} Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.060052 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.072748 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" podStartSLOduration=6.072729346 podStartE2EDuration="6.072729346s" podCreationTimestamp="2026-03-20 13:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:41.070546087 +0000 UTC m=+1260.580265053" watchObservedRunningTime="2026-03-20 13:42:41.072729346 +0000 UTC m=+1260.582448313" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.107018 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56bf665d85-xzq8s" podStartSLOduration=5.106999401 podStartE2EDuration="5.106999401s" podCreationTimestamp="2026-03-20 13:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:41.094008274 +0000 UTC m=+1260.603727240" watchObservedRunningTime="2026-03-20 13:42:41.106999401 +0000 UTC m=+1260.616718367" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.236818 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" path="/var/lib/kubelet/pods/11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2/volumes" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.237508 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" path="/var/lib/kubelet/pods/2708316e-74e9-4fcc-948c-02f3c3e712ff/volumes" Mar 20 13:42:41 crc 
kubenswrapper[4895]: I0320 13:42:41.238127 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67441dde-3458-42fb-a8fd-556636ed6613" path="/var/lib/kubelet/pods/67441dde-3458-42fb-a8fd-556636ed6613/volumes" Mar 20 13:42:41 crc kubenswrapper[4895]: I0320 13:42:41.239153 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" path="/var/lib/kubelet/pods/ab625b59-c43a-498a-b79d-c7952511fe4e/volumes" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.085274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerStarted","Data":"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d"} Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.086052 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api-log" containerID="cri-o://513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" gracePeriod=30 Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.086293 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.086566 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api" containerID="cri-o://1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" gracePeriod=30 Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.093785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerStarted","Data":"09f4c28c1b196dbe741fc763423632e937e6ef8c23c9c0c20685e75bb93a6ce0"} Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.126456 
4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.126440305 podStartE2EDuration="7.126440305s" podCreationTimestamp="2026-03-20 13:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:42.119579188 +0000 UTC m=+1261.629298154" watchObservedRunningTime="2026-03-20 13:42:42.126440305 +0000 UTC m=+1261.636159271" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.156855 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.29139562 podStartE2EDuration="8.156838471s" podCreationTimestamp="2026-03-20 13:42:34 +0000 UTC" firstStartedPulling="2026-03-20 13:42:38.889142998 +0000 UTC m=+1258.398861964" lastFinishedPulling="2026-03-20 13:42:39.754585859 +0000 UTC m=+1259.264304815" observedRunningTime="2026-03-20 13:42:42.149770219 +0000 UTC m=+1261.659489185" watchObservedRunningTime="2026-03-20 13:42:42.156838471 +0000 UTC m=+1261.666557437" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.750647 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.890138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.890521 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85l47\" (UniqueName: \"kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.890824 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.891155 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.891304 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.891426 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.891518 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs\") pod \"e8762b46-24e0-478c-a4ca-61b8db29481b\" (UID: \"e8762b46-24e0-478c-a4ca-61b8db29481b\") " Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.893176 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs" (OuterVolumeSpecName: "logs") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.893469 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.896037 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47" (OuterVolumeSpecName: "kube-api-access-85l47") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "kube-api-access-85l47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.896350 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts" (OuterVolumeSpecName: "scripts") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.908203 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.924129 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.949385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data" (OuterVolumeSpecName: "config-data") pod "e8762b46-24e0-478c-a4ca-61b8db29481b" (UID: "e8762b46-24e0-478c-a4ca-61b8db29481b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996293 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8762b46-24e0-478c-a4ca-61b8db29481b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996715 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996737 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996753 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996772 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8762b46-24e0-478c-a4ca-61b8db29481b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996790 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8762b46-24e0-478c-a4ca-61b8db29481b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:42 crc kubenswrapper[4895]: I0320 13:42:42.996806 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85l47\" (UniqueName: \"kubernetes.io/projected/e8762b46-24e0-478c-a4ca-61b8db29481b-kube-api-access-85l47\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.130340 4895 generic.go:334] "Generic 
(PLEG): container finished" podID="eb790d89-50de-47f6-9361-0c2f1bf39636" containerID="dea42e680acad4d05684d3cac05722c24e32c1f8e0daf0745198451e0e2860c3" exitCode=0 Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.130431 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wbsfq" event={"ID":"eb790d89-50de-47f6-9361-0c2f1bf39636","Type":"ContainerDied","Data":"dea42e680acad4d05684d3cac05722c24e32c1f8e0daf0745198451e0e2860c3"} Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.133648 4895 generic.go:334] "Generic (PLEG): container finished" podID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerID="1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" exitCode=0 Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.133726 4895 generic.go:334] "Generic (PLEG): container finished" podID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerID="513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" exitCode=143 Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.134628 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.134756 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerDied","Data":"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d"} Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.136458 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerDied","Data":"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd"} Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.136609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8762b46-24e0-478c-a4ca-61b8db29481b","Type":"ContainerDied","Data":"5240f72dc76a1a4d2d216c52d732d39c6ede8e6eb70145773561982afba3da3a"} Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.136737 4895 scope.go:117] "RemoveContainer" containerID="1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.257118 4895 scope.go:117] "RemoveContainer" containerID="513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.349681 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.349730 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.349747 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350150 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="init" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350162 
4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="init" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350175 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350181 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api-log" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350191 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350197 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker-log" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350212 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350218 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener-log" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350227 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350235 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350248 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" Mar 20 13:42:43 crc 
kubenswrapper[4895]: I0320 13:42:43.350253 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350263 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350268 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350280 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350285 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350296 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="dnsmasq-dns" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350301 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="dnsmasq-dns" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.350309 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-api" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350315 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-api" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350505 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-api" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350523 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350530 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350539 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350547 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2708316e-74e9-4fcc-948c-02f3c3e712ff" containerName="dnsmasq-dns" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350561 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11165ae9-5814-49ea-a5ba-5bc2f7aa7fd2" containerName="neutron-httpd" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350571 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="67441dde-3458-42fb-a8fd-556636ed6613" containerName="barbican-keystone-listener" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350580 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" containerName="cinder-api-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.350591 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab625b59-c43a-498a-b79d-c7952511fe4e" containerName="barbican-worker-log" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.351610 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.354787 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.355171 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.357084 4895 scope.go:117] "RemoveContainer" containerID="1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.364191 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:42:43 crc kubenswrapper[4895]: E0320 13:42:43.366820 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d\": container with ID starting with 1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d not found: ID does not exist" containerID="1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.366865 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d"} err="failed to get container status \"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d\": rpc error: code = NotFound desc = could not find container \"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d\": container with ID starting with 1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d not found: ID does not exist" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.366890 4895 scope.go:117] "RemoveContainer" containerID="513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" Mar 20 13:42:43 crc 
kubenswrapper[4895]: E0320 13:42:43.370593 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd\": container with ID starting with 513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd not found: ID does not exist" containerID="513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.370722 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd"} err="failed to get container status \"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd\": rpc error: code = NotFound desc = could not find container \"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd\": container with ID starting with 513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd not found: ID does not exist" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.370804 4895 scope.go:117] "RemoveContainer" containerID="1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.371303 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d"} err="failed to get container status \"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d\": rpc error: code = NotFound desc = could not find container \"1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d\": container with ID starting with 1f8b4df3f76050ca59150cfd8c8210231110557f3b282a24c5b6504734f1d12d not found: ID does not exist" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.371556 4895 scope.go:117] "RemoveContainer" containerID="513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd" Mar 20 
13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.371911 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd"} err="failed to get container status \"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd\": rpc error: code = NotFound desc = could not find container \"513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd\": container with ID starting with 513115fb7cbb00e690aff9a1c22cf97fb9d976a036d140f4b0aafafaf818dadd not found: ID does not exist" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.389904 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447629 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447688 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497df96-267d-4b80-8b6b-01fbd8a6477c-logs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447741 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447788 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-scripts\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447831 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447857 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f497df96-267d-4b80-8b6b-01fbd8a6477c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447872 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.447900 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmhzx\" (UniqueName: 
\"kubernetes.io/projected/f497df96-267d-4b80-8b6b-01fbd8a6477c-kube-api-access-dmhzx\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550003 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550096 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-scripts\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550137 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550168 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f497df96-267d-4b80-8b6b-01fbd8a6477c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550903 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: 
I0320 13:42:43.550850 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f497df96-267d-4b80-8b6b-01fbd8a6477c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.550996 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmhzx\" (UniqueName: \"kubernetes.io/projected/f497df96-267d-4b80-8b6b-01fbd8a6477c-kube-api-access-dmhzx\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.551790 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.551821 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.552145 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497df96-267d-4b80-8b6b-01fbd8a6477c-logs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.552591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497df96-267d-4b80-8b6b-01fbd8a6477c-logs\") pod \"cinder-api-0\" (UID: 
\"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.554457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-scripts\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.555166 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.555567 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.556379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.556477 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.559150 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f497df96-267d-4b80-8b6b-01fbd8a6477c-config-data-custom\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.573420 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmhzx\" (UniqueName: \"kubernetes.io/projected/f497df96-267d-4b80-8b6b-01fbd8a6477c-kube-api-access-dmhzx\") pod \"cinder-api-0\" (UID: \"f497df96-267d-4b80-8b6b-01fbd8a6477c\") " pod="openstack/cinder-api-0" Mar 20 13:42:43 crc kubenswrapper[4895]: I0320 13:42:43.693049 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.234967 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.728085 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.756693 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.880893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.880986 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4wh8\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8\") pod \"eb790d89-50de-47f6-9361-0c2f1bf39636\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881042 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881055 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts\") pod \"eb790d89-50de-47f6-9361-0c2f1bf39636\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs\") pod \"eb790d89-50de-47f6-9361-0c2f1bf39636\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881136 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data\") pod \"eb790d89-50de-47f6-9361-0c2f1bf39636\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881217 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881233 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881282 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle\") pod \"eb790d89-50de-47f6-9361-0c2f1bf39636\" (UID: \"eb790d89-50de-47f6-9361-0c2f1bf39636\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.881321 
4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tghnc\" (UniqueName: \"kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc\") pod \"b183be69-2ea8-4753-a58d-190aa454c73c\" (UID: \"b183be69-2ea8-4753-a58d-190aa454c73c\") " Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.882930 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.884114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.888024 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts" (OuterVolumeSpecName: "scripts") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.889562 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc" (OuterVolumeSpecName: "kube-api-access-tghnc") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "kube-api-access-tghnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.892650 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8" (OuterVolumeSpecName: "kube-api-access-p4wh8") pod "eb790d89-50de-47f6-9361-0c2f1bf39636" (UID: "eb790d89-50de-47f6-9361-0c2f1bf39636"). InnerVolumeSpecName "kube-api-access-p4wh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.894698 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs" (OuterVolumeSpecName: "certs") pod "eb790d89-50de-47f6-9361-0c2f1bf39636" (UID: "eb790d89-50de-47f6-9361-0c2f1bf39636"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.918573 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts" (OuterVolumeSpecName: "scripts") pod "eb790d89-50de-47f6-9361-0c2f1bf39636" (UID: "eb790d89-50de-47f6-9361-0c2f1bf39636"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.949862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data" (OuterVolumeSpecName: "config-data") pod "eb790d89-50de-47f6-9361-0c2f1bf39636" (UID: "eb790d89-50de-47f6-9361-0c2f1bf39636"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.957269 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb790d89-50de-47f6-9361-0c2f1bf39636" (UID: "eb790d89-50de-47f6-9361-0c2f1bf39636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.975532 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984356 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984384 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4wh8\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-kube-api-access-p4wh8\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984426 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984435 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 
13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984445 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eb790d89-50de-47f6-9361-0c2f1bf39636-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984454 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b183be69-2ea8-4753-a58d-190aa454c73c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984463 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984471 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984481 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb790d89-50de-47f6-9361-0c2f1bf39636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:44 crc kubenswrapper[4895]: I0320 13:42:44.984489 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tghnc\" (UniqueName: \"kubernetes.io/projected/b183be69-2ea8-4753-a58d-190aa454c73c-kube-api-access-tghnc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.013128 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.027344 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data" (OuterVolumeSpecName: "config-data") pod "b183be69-2ea8-4753-a58d-190aa454c73c" (UID: "b183be69-2ea8-4753-a58d-190aa454c73c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.086368 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.086419 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b183be69-2ea8-4753-a58d-190aa454c73c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.156766 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-wbsfq" event={"ID":"eb790d89-50de-47f6-9361-0c2f1bf39636","Type":"ContainerDied","Data":"ef36cf7f5f6d3e30476b30259d26c8c70c964da9b5e1d94535858969040a9c20"} Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.156806 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef36cf7f5f6d3e30476b30259d26c8c70c964da9b5e1d94535858969040a9c20" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.156866 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-wbsfq" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.160016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f497df96-267d-4b80-8b6b-01fbd8a6477c","Type":"ContainerStarted","Data":"b0c57dbf4dcb9634a1d9985ea4db0ca24a5cf57437f1eef0bdc4363fe265199c"} Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.160057 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f497df96-267d-4b80-8b6b-01fbd8a6477c","Type":"ContainerStarted","Data":"961eb62b55ec1a6b88ae0c19677d3ed7eccc4af7ffccdbb6033500be4dd38d38"} Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.161737 4895 generic.go:334] "Generic (PLEG): container finished" podID="b183be69-2ea8-4753-a58d-190aa454c73c" containerID="48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e" exitCode=0 Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.161764 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerDied","Data":"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e"} Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.161781 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b183be69-2ea8-4753-a58d-190aa454c73c","Type":"ContainerDied","Data":"2c0d221171549fc3a05e8968e98773ed9d9d9053bcafb4ff210b3a1b3050b02f"} Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.161797 4895 scope.go:117] "RemoveContainer" containerID="28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.161896 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.213139 4895 scope.go:117] "RemoveContainer" containerID="99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.252507 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8762b46-24e0-478c-a4ca-61b8db29481b" path="/var/lib/kubelet/pods/e8762b46-24e0-478c-a4ca-61b8db29481b/volumes" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.302152 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.337561 4895 scope.go:117] "RemoveContainer" containerID="48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.341858 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.357204 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.383537 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.383954 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-central-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.383967 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-central-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.383978 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb790d89-50de-47f6-9361-0c2f1bf39636" containerName="cloudkitty-storageinit" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.383984 4895 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="eb790d89-50de-47f6-9361-0c2f1bf39636" containerName="cloudkitty-storageinit" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.383996 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="proxy-httpd" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384002 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="proxy-httpd" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.384013 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="sg-core" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384018 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="sg-core" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.384054 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-notification-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384060 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-notification-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384271 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="sg-core" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384283 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-notification-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384289 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="proxy-httpd" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384309 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" containerName="ceilometer-central-agent" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.384322 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb790d89-50de-47f6-9361-0c2f1bf39636" containerName="cloudkitty-storageinit" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.385986 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.394661 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.398481 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.404291 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.467713 4895 scope.go:117] "RemoveContainer" containerID="9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.491697 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.492980 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494680 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494753 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql88g\" (UniqueName: \"kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494849 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494877 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494925 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.494942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.495846 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.496053 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.496152 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.496245 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.496331 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pbltf" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.508633 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.592431 4895 scope.go:117] "RemoveContainer" containerID="28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.598421 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c\": container with ID starting with 28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c not found: ID does not exist" containerID="28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.598466 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c"} err="failed to get container status \"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c\": rpc error: code = NotFound desc = could not find container \"28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c\": container with ID starting with 28e66625cacd528a7ef073d0926029cd6f07bc7aec27c46761a74f66b79c0e0c not found: ID does not exist" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.598498 4895 scope.go:117] "RemoveContainer" containerID="99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.601151 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25\": container with ID starting with 99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25 not found: ID does not exist" containerID="99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.604451 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25"} err="failed to get container status \"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25\": rpc error: code = NotFound desc = could 
not find container \"99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25\": container with ID starting with 99541745759ec12ab4b5262f69c8e7de707de06ace092f1b57b0cb41228c6a25 not found: ID does not exist" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.604834 4895 scope.go:117] "RemoveContainer" containerID="48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.602157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605296 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605583 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605699 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7ctw8\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605843 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.605964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606164 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606332 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql88g\" (UniqueName: \"kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") 
" pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606725 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.606957 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.613877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.613897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.620190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.620645 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.622421 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.629295 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e\": container with ID starting with 48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e not found: ID does not exist" containerID="48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.629686 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e"} err="failed to get container status \"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e\": rpc error: code = NotFound desc = could not find container \"48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e\": container with ID starting with 48dd5b37475083601a7dd6720e2c1ceedaac554dea2e2bd70f67d0066017807e not found: ID does not exist" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 
13:42:45.629718 4895 scope.go:117] "RemoveContainer" containerID="9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.643223 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: E0320 13:42:45.654688 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854\": container with ID starting with 9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854 not found: ID does not exist" containerID="9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.654734 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854"} err="failed to get container status \"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854\": rpc error: code = NotFound desc = could not find container \"9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854\": container with ID starting with 9ed8e58044fe592b435681f8f27c3976f3ebc6f0455dee4ff7922b4a7b4e3854 not found: ID does not exist" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.655814 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql88g\" (UniqueName: \"kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g\") pod \"ceilometer-0\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.655841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.659490 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710602 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710748 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts\") pod \"cloudkitty-proc-0\" (UID: 
\"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.710819 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ctw8\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.730317 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.732200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.732751 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.744002 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.744772 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.745408 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ctw8\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8\") pod \"cloudkitty-proc-0\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.746924 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.762935 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.764744 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.809792 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.824861 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.880722 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.882490 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.890235 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.890787 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.917913 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7c7\" (UniqueName: \"kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.917963 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.918059 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.918083 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" 
Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.918099 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.918173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:45 crc kubenswrapper[4895]: I0320 13:42:45.963104 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.025962 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026416 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026474 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026539 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmwf\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026801 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026907 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.026976 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.027006 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7c7\" (UniqueName: \"kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.027034 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.027055 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.027669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.027677 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.028346 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.028417 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.029121 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: 
\"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.050161 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7c7\" (UniqueName: \"kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7\") pod \"dnsmasq-dns-58bd69657f-z9tgz\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.128993 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129063 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129127 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129154 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129248 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmwf\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.129518 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.133990 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.134432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: 
\"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.135912 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.137903 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.153692 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.156911 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.159673 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmwf\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf\") pod \"cloudkitty-api-0\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.186004 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f497df96-267d-4b80-8b6b-01fbd8a6477c","Type":"ContainerStarted","Data":"0b7a8526a7725f991fb608f8fcf1796ca8af30e5fcec9aa60d51ba4bfc1e35dd"} Mar 20 13:42:46 crc 
kubenswrapper[4895]: I0320 13:42:46.187557 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.197670 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="dnsmasq-dns" containerID="cri-o://6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101" gracePeriod=10 Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.225025 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.226922 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.226910591 podStartE2EDuration="3.226910591s" podCreationTimestamp="2026-03-20 13:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:46.218028437 +0000 UTC m=+1265.727747403" watchObservedRunningTime="2026-03-20 13:42:46.226910591 +0000 UTC m=+1265.736629557" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.327825 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.348042 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cd7655cd6-lh976" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:55982->10.217.0.187:9311: read: connection reset by peer" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.355629 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5cd7655cd6-lh976" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:55994->10.217.0.187:9311: read: connection reset by peer" Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.453496 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.467058 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:42:46 crc kubenswrapper[4895]: I0320 13:42:46.710703 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.003295 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.064671 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.112659 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:47 crc kubenswrapper[4895]: W0320 13:42:47.117697 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106336f9_7a5a_4e3d_8973_ca2c5c89109e.slice/crio-799f560ca63e5b66aa9133387767c2fa022dbcefb36d0106da577b6aaf97bc9d WatchSource:0}: Error finding container 799f560ca63e5b66aa9133387767c2fa022dbcefb36d0106da577b6aaf97bc9d: Status 404 returned error can't find the container with id 799f560ca63e5b66aa9133387767c2fa022dbcefb36d0106da577b6aaf97bc9d Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.120609 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.159380 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom\") pod \"0e3c7413-f59c-4cd6-9ba8-868775311f08\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.159452 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data\") pod \"0e3c7413-f59c-4cd6-9ba8-868775311f08\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.159474 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs\") pod \"0e3c7413-f59c-4cd6-9ba8-868775311f08\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.159640 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle\") pod \"0e3c7413-f59c-4cd6-9ba8-868775311f08\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.159733 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn9l4\" (UniqueName: \"kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4\") pod \"0e3c7413-f59c-4cd6-9ba8-868775311f08\" (UID: \"0e3c7413-f59c-4cd6-9ba8-868775311f08\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.162361 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs" (OuterVolumeSpecName: "logs") pod "0e3c7413-f59c-4cd6-9ba8-868775311f08" (UID: "0e3c7413-f59c-4cd6-9ba8-868775311f08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.166696 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0e3c7413-f59c-4cd6-9ba8-868775311f08" (UID: "0e3c7413-f59c-4cd6-9ba8-868775311f08"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.167029 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4" (OuterVolumeSpecName: "kube-api-access-bn9l4") pod "0e3c7413-f59c-4cd6-9ba8-868775311f08" (UID: "0e3c7413-f59c-4cd6-9ba8-868775311f08"). InnerVolumeSpecName "kube-api-access-bn9l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.175338 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.218075 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e3c7413-f59c-4cd6-9ba8-868775311f08" (UID: "0e3c7413-f59c-4cd6-9ba8-868775311f08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.243454 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b183be69-2ea8-4753-a58d-190aa454c73c" path="/var/lib/kubelet/pods/b183be69-2ea8-4753-a58d-190aa454c73c/volumes" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.250935 4895 generic.go:334] "Generic (PLEG): container finished" podID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerID="ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686" exitCode=0 Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.251021 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cd7655cd6-lh976" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.261471 4895 generic.go:334] "Generic (PLEG): container finished" podID="44d435ed-069c-4447-845d-e957cc94e498" containerID="6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101" exitCode=0 Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.261581 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.261932 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.262005 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.262036 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xng7m\" (UniqueName: \"kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.262233 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.262275 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.262293 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb\") pod \"44d435ed-069c-4447-845d-e957cc94e498\" (UID: \"44d435ed-069c-4447-845d-e957cc94e498\") " Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.275040 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.275078 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn9l4\" (UniqueName: \"kubernetes.io/projected/0e3c7413-f59c-4cd6-9ba8-868775311f08-kube-api-access-bn9l4\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.275088 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.275097 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3c7413-f59c-4cd6-9ba8-868775311f08-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.276848 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m" (OuterVolumeSpecName: "kube-api-access-xng7m") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "kube-api-access-xng7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.293369 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="cinder-scheduler" containerID="cri-o://e91c68ae535076087507fb3fdb0dc3dd79f7bbe6de3fde2c019c726a2bfaf852" gracePeriod=30 Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.293882 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="probe" containerID="cri-o://09f4c28c1b196dbe741fc763423632e937e6ef8c23c9c0c20685e75bb93a6ce0" gracePeriod=30 Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.338522 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data" (OuterVolumeSpecName: "config-data") pod "0e3c7413-f59c-4cd6-9ba8-868775311f08" (UID: "0e3c7413-f59c-4cd6-9ba8-868775311f08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.380453 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config" (OuterVolumeSpecName: "config") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.381082 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3c7413-f59c-4cd6-9ba8-868775311f08-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.381119 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.381128 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xng7m\" (UniqueName: \"kubernetes.io/projected/44d435ed-069c-4447-845d-e957cc94e498-kube-api-access-xng7m\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.387323 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.396264 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.397790 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.430279 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44d435ed-069c-4447-845d-e957cc94e498" (UID: "44d435ed-069c-4447-845d-e957cc94e498"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.483184 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.483222 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.483233 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.483242 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44d435ed-069c-4447-845d-e957cc94e498-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543415 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerStarted","Data":"7ee4c1e11df7dac6889d15329b2f3a5f2f2034f42aaad505c46604a17e86e0ac"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"489741aa-97a9-4f41-b94b-7e49b07a27f1","Type":"ContainerStarted","Data":"e28e47c38b270d664f6a2532bfd6bae4d6b1c0d031001f4bbb8118d5d9e864aa"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerDied","Data":"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543492 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cd7655cd6-lh976" event={"ID":"0e3c7413-f59c-4cd6-9ba8-868775311f08","Type":"ContainerDied","Data":"c463dfb1f9f16fc3e6e8b18214898d27091772c0db2f6235044b8828fff3ec15"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543502 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" event={"ID":"44d435ed-069c-4447-845d-e957cc94e498","Type":"ContainerDied","Data":"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543514 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bv7ct" event={"ID":"44d435ed-069c-4447-845d-e957cc94e498","Type":"ContainerDied","Data":"f06b38aeed5a8582bfdd6960cb8b265cc6d82e7c7c685c5ac9253103a5b6e31c"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" event={"ID":"71bfd8c2-e6fb-408a-affd-75569329c598","Type":"ContainerStarted","Data":"28aefa9b85bbe92d279062b66c685753fca74745e82e5971a7c5161b28f53e49"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543524 4895 scope.go:117] "RemoveContainer" containerID="ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.543533 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerStarted","Data":"799f560ca63e5b66aa9133387767c2fa022dbcefb36d0106da577b6aaf97bc9d"} Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.591672 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.596254 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5cd7655cd6-lh976"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.623659 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.635353 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bv7ct"] Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.864376 4895 scope.go:117] "RemoveContainer" containerID="67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.952232 4895 scope.go:117] "RemoveContainer" containerID="ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686" Mar 20 13:42:47 crc kubenswrapper[4895]: E0320 13:42:47.952712 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686\": container with ID starting with 
ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686 not found: ID does not exist" containerID="ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.952743 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686"} err="failed to get container status \"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686\": rpc error: code = NotFound desc = could not find container \"ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686\": container with ID starting with ab64295f6d69d5d4484520229e6360ebd24f425134e3c61c0be572ad68b2a686 not found: ID does not exist" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.952763 4895 scope.go:117] "RemoveContainer" containerID="67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8" Mar 20 13:42:47 crc kubenswrapper[4895]: E0320 13:42:47.953347 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8\": container with ID starting with 67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8 not found: ID does not exist" containerID="67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.953379 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8"} err="failed to get container status \"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8\": rpc error: code = NotFound desc = could not find container \"67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8\": container with ID starting with 67377260463f8f5153e5444e9c7b78d23f68c93d7af0a5a4ed0f79e42b7492f8 not found: ID does not 
exist" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.953433 4895 scope.go:117] "RemoveContainer" containerID="6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101" Mar 20 13:42:47 crc kubenswrapper[4895]: I0320 13:42:47.983727 4895 scope.go:117] "RemoveContainer" containerID="1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.055007 4895 scope.go:117] "RemoveContainer" containerID="6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101" Mar 20 13:42:48 crc kubenswrapper[4895]: E0320 13:42:48.056035 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101\": container with ID starting with 6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101 not found: ID does not exist" containerID="6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.056097 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101"} err="failed to get container status \"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101\": rpc error: code = NotFound desc = could not find container \"6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101\": container with ID starting with 6ae64cedaf75ac72ec9383762502375f75d913ede6fc1fb5555a0bfba04c0101 not found: ID does not exist" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.056127 4895 scope.go:117] "RemoveContainer" containerID="1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2" Mar 20 13:42:48 crc kubenswrapper[4895]: E0320 13:42:48.056367 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2\": container with ID starting with 1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2 not found: ID does not exist" containerID="1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.056405 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2"} err="failed to get container status \"1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2\": rpc error: code = NotFound desc = could not find container \"1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2\": container with ID starting with 1d15e49c848e659eb0550ec79fdccf4b4cdace5b316fbfd34b2c9b3059b900a2 not found: ID does not exist" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.273515 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.303441 4895 generic.go:334] "Generic (PLEG): container finished" podID="71bfd8c2-e6fb-408a-affd-75569329c598" containerID="1cc0da441010671eb68be61727574a7b05ee812152dedad32c8ad3962ee196aa" exitCode=0 Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.303508 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" event={"ID":"71bfd8c2-e6fb-408a-affd-75569329c598","Type":"ContainerDied","Data":"1cc0da441010671eb68be61727574a7b05ee812152dedad32c8ad3962ee196aa"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.305681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerStarted","Data":"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.305713 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerStarted","Data":"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.305809 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.307498 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerStarted","Data":"958b04dbda82d360e516f188cdccab8ea6bfcabb6915f55023e4c32671bce962"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.310485 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"489741aa-97a9-4f41-b94b-7e49b07a27f1","Type":"ContainerStarted","Data":"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.313090 4895 generic.go:334] "Generic (PLEG): container finished" podID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerID="09f4c28c1b196dbe741fc763423632e937e6ef8c23c9c0c20685e75bb93a6ce0" exitCode=0 Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.313166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerDied","Data":"09f4c28c1b196dbe741fc763423632e937e6ef8c23c9c0c20685e75bb93a6ce0"} Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.356713 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=1.87679646 podStartE2EDuration="3.356691719s" podCreationTimestamp="2026-03-20 13:42:45 +0000 UTC" firstStartedPulling="2026-03-20 13:42:46.473466402 +0000 UTC m=+1265.983185368" lastFinishedPulling="2026-03-20 13:42:47.953361661 +0000 UTC m=+1267.463080627" 
observedRunningTime="2026-03-20 13:42:48.35278831 +0000 UTC m=+1267.862507276" watchObservedRunningTime="2026-03-20 13:42:48.356691719 +0000 UTC m=+1267.866410685" Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.396211 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:42:48 crc kubenswrapper[4895]: I0320 13:42:48.410374 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.410356047 podStartE2EDuration="3.410356047s" podCreationTimestamp="2026-03-20 13:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:48.384829433 +0000 UTC m=+1267.894548399" watchObservedRunningTime="2026-03-20 13:42:48.410356047 +0000 UTC m=+1267.920075013" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.233767 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" path="/var/lib/kubelet/pods/0e3c7413-f59c-4cd6-9ba8-868775311f08/volumes" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.235014 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d435ed-069c-4447-845d-e957cc94e498" path="/var/lib/kubelet/pods/44d435ed-069c-4447-845d-e957cc94e498/volumes" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.383902 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerStarted","Data":"7146ecf46b7155747607ccb8d57de7b0f88296f62062a56ff047cdf1a3b0ee6f"} Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.403619 4895 generic.go:334] "Generic (PLEG): container finished" podID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerID="e91c68ae535076087507fb3fdb0dc3dd79f7bbe6de3fde2c019c726a2bfaf852" exitCode=0 Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.403964 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerDied","Data":"e91c68ae535076087507fb3fdb0dc3dd79f7bbe6de3fde2c019c726a2bfaf852"} Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.443421 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" event={"ID":"71bfd8c2-e6fb-408a-affd-75569329c598","Type":"ContainerStarted","Data":"d814b26ac164dcfecaa734ad8e39248ad6a76743820c119cac765e4d415f361e"} Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.443496 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.443642 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api-log" containerID="cri-o://de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" gracePeriod=30 Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.443704 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api" containerID="cri-o://97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" gracePeriod=30 Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.487172 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" podStartSLOduration=4.487147833 podStartE2EDuration="4.487147833s" podCreationTimestamp="2026-03-20 13:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:49.473845169 +0000 UTC m=+1268.983564135" watchObservedRunningTime="2026-03-20 13:42:49.487147833 +0000 UTC m=+1268.996866799" 
Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.592884 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.637893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.637932 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.638064 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.638092 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.638112 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c256\" (UniqueName: \"kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: 
I0320 13:42:49.638168 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data\") pod \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\" (UID: \"23f4595e-9cd3-47ea-a1d1-9316bccca16a\") " Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.638967 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.643682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256" (OuterVolumeSpecName: "kube-api-access-4c256") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). InnerVolumeSpecName "kube-api-access-4c256". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.644114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts" (OuterVolumeSpecName: "scripts") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.645523 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.731415 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.745738 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.745802 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c256\" (UniqueName: \"kubernetes.io/projected/23f4595e-9cd3-47ea-a1d1-9316bccca16a-kube-api-access-4c256\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.745815 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.745824 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23f4595e-9cd3-47ea-a1d1-9316bccca16a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.745837 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.818373 4895 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data" (OuterVolumeSpecName: "config-data") pod "23f4595e-9cd3-47ea-a1d1-9316bccca16a" (UID: "23f4595e-9cd3-47ea-a1d1-9316bccca16a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:49 crc kubenswrapper[4895]: I0320 13:42:49.847132 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f4595e-9cd3-47ea-a1d1-9316bccca16a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.067003 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xmwf\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153551 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153612 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153652 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153686 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153758 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.153789 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom\") pod \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\" (UID: \"106336f9-7a5a-4e3d-8973-ca2c5c89109e\") " Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.154889 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs" (OuterVolumeSpecName: "logs") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.161762 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf" (OuterVolumeSpecName: "kube-api-access-4xmwf") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). 
InnerVolumeSpecName "kube-api-access-4xmwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.161892 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.162044 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts" (OuterVolumeSpecName: "scripts") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.162982 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs" (OuterVolumeSpecName: "certs") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.191625 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.221518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data" (OuterVolumeSpecName: "config-data") pod "106336f9-7a5a-4e3d-8973-ca2c5c89109e" (UID: "106336f9-7a5a-4e3d-8973-ca2c5c89109e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255905 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255942 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255952 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255960 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255968 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/106336f9-7a5a-4e3d-8973-ca2c5c89109e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255977 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/106336f9-7a5a-4e3d-8973-ca2c5c89109e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.255987 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xmwf\" (UniqueName: \"kubernetes.io/projected/106336f9-7a5a-4e3d-8973-ca2c5c89109e-kube-api-access-4xmwf\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.467298 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerStarted","Data":"0ef231dc8ab5097dac9c38d8ae108baf23e5d5d5910e5b917136b6dc2f81d191"} Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.469820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23f4595e-9cd3-47ea-a1d1-9316bccca16a","Type":"ContainerDied","Data":"dac46fd2dd9b382e0fd42a3ac008a54baffbf0513fb967aacfe4e2eaedaba4c0"} Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.469870 4895 scope.go:117] "RemoveContainer" containerID="09f4c28c1b196dbe741fc763423632e937e6ef8c23c9c0c20685e75bb93a6ce0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.470026 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.505660 4895 generic.go:334] "Generic (PLEG): container finished" podID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerID="97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" exitCode=0 Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506150 4895 generic.go:334] "Generic (PLEG): container finished" podID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerID="de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" exitCode=143 Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506259 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerDied","Data":"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1"} Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506328 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerDied","Data":"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf"} Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506338 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"106336f9-7a5a-4e3d-8973-ca2c5c89109e","Type":"ContainerDied","Data":"799f560ca63e5b66aa9133387767c2fa022dbcefb36d0106da577b6aaf97bc9d"} Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.506517 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="489741aa-97a9-4f41-b94b-7e49b07a27f1" containerName="cloudkitty-proc" containerID="cri-o://8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e" gracePeriod=30 Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.520102 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.547627 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.573782 4895 scope.go:117] "RemoveContainer" containerID="e91c68ae535076087507fb3fdb0dc3dd79f7bbe6de3fde2c019c726a2bfaf852" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.596715 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 
13:42:50.597341 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="init" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597361 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="init" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597373 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597381 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597416 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="dnsmasq-dns" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597423 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="dnsmasq-dns" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597432 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597438 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597452 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="cinder-scheduler" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597458 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="cinder-scheduler" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597485 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597491 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597500 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="probe" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597506 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="probe" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.597531 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597538 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597831 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="probe" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597849 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597859 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" containerName="cinder-scheduler" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597887 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" containerName="cloudkitty-api" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597901 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597910 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3c7413-f59c-4cd6-9ba8-868775311f08" containerName="barbican-api-log" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.597925 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d435ed-069c-4447-845d-e957cc94e498" containerName="dnsmasq-dns" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.599726 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.612844 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.620498 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.640746 4895 scope.go:117] "RemoveContainer" containerID="97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.672754 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.672954 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc 
kubenswrapper[4895]: I0320 13:42:50.673036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgnz\" (UniqueName: \"kubernetes.io/projected/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-kube-api-access-8xgnz\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.673121 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.673198 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-scripts\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.673358 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.698460 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.707895 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.716238 4895 scope.go:117] "RemoveContainer" 
containerID="de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.726127 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.727837 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.735993 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.736327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.736513 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.738748 4895 scope.go:117] "RemoveContainer" containerID="97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.740191 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1\": container with ID starting with 97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1 not found: ID does not exist" containerID="97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740232 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1"} err="failed to get container status \"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1\": rpc error: code = NotFound desc = could not find container 
\"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1\": container with ID starting with 97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1 not found: ID does not exist" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740259 4895 scope.go:117] "RemoveContainer" containerID="de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.740483 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf\": container with ID starting with de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf not found: ID does not exist" containerID="de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740504 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf"} err="failed to get container status \"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf\": rpc error: code = NotFound desc = could not find container \"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf\": container with ID starting with de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf not found: ID does not exist" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740516 4895 scope.go:117] "RemoveContainer" containerID="97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740670 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1"} err="failed to get container status \"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1\": rpc error: code = NotFound desc = could not find 
container \"97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1\": container with ID starting with 97be486afef9ddc74125d0a32653ebe10b2cd1f85209ed8a6712f7caeff405c1 not found: ID does not exist" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740689 4895 scope.go:117] "RemoveContainer" containerID="de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.740835 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf"} err="failed to get container status \"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf\": rpc error: code = NotFound desc = could not find container \"de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf\": container with ID starting with de0a8448ae948de4a5e7262d1584745a72c667053b0ece1bf6c2017774217daf not found: ID does not exist" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.757707 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775356 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775406 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqb6v\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775443 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775550 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775657 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgnz\" (UniqueName: \"kubernetes.io/projected/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-kube-api-access-8xgnz\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775725 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-scripts\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.775748 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.779501 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.783605 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-scripts\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.788960 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.794596 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: E0320 13:42:50.796276 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f4595e_9cd3_47ea_a1d1_9316bccca16a.slice/crio-dac46fd2dd9b382e0fd42a3ac008a54baffbf0513fb967aacfe4e2eaedaba4c0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106336f9_7a5a_4e3d_8973_ca2c5c89109e.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.799884 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgnz\" (UniqueName: \"kubernetes.io/projected/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-kube-api-access-8xgnz\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.815040 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1547aa-5cfe-4a70-bd45-0d06101f0e74-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7c1547aa-5cfe-4a70-bd45-0d06101f0e74\") " pod="openstack/cinder-scheduler-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0" Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877376 
4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877417 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877508 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877576 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqb6v\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877594 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.877626 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.878006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.883403 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.883769 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.884175 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.884530 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.896862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.900880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqb6v\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.902902 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.905867 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts\") pod \"cloudkitty-api-0\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " pod="openstack/cloudkitty-api-0"
Mar 20 13:42:50 crc kubenswrapper[4895]: I0320 13:42:50.961574 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 13:42:51 crc kubenswrapper[4895]: I0320 13:42:51.053485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Mar 20 13:42:51 crc kubenswrapper[4895]: I0320 13:42:51.252718 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106336f9-7a5a-4e3d-8973-ca2c5c89109e" path="/var/lib/kubelet/pods/106336f9-7a5a-4e3d-8973-ca2c5c89109e/volumes"
Mar 20 13:42:51 crc kubenswrapper[4895]: I0320 13:42:51.259847 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f4595e-9cd3-47ea-a1d1-9316bccca16a" path="/var/lib/kubelet/pods/23f4595e-9cd3-47ea-a1d1-9316bccca16a/volumes"
Mar 20 13:42:51 crc kubenswrapper[4895]: I0320 13:42:51.539862 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:42:51 crc kubenswrapper[4895]: I0320 13:42:51.697841 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Mar 20 13:42:51 crc kubenswrapper[4895]: W0320 13:42:51.705711 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1850d10_9153_42cc_93de_ef76e2d9a8c1.slice/crio-ea12950ed42a8cb692af598414c1d40bc22a29c033c07a5eda7640b39a839ded WatchSource:0}: Error finding container ea12950ed42a8cb692af598414c1d40bc22a29c033c07a5eda7640b39a839ded: Status 404 returned error can't find the container with id ea12950ed42a8cb692af598414c1d40bc22a29c033c07a5eda7640b39a839ded
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.564108 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7c1547aa-5cfe-4a70-bd45-0d06101f0e74","Type":"ContainerStarted","Data":"7d432eab2c2ea7d1f73dc8c507d9e09ddf8cbbeff6bed50bf9dcc21baa7dadd7"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.564408 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7c1547aa-5cfe-4a70-bd45-0d06101f0e74","Type":"ContainerStarted","Data":"52a09744162025abed2dfbce7755aa4ba7e3dbf0255ea2ad073d5074ad3ee4c3"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.567129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerStarted","Data":"6a31c32c86d08b7269edf1a12637c7c02f606ddf536e7551a2815a2112193001"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.567236 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.571102 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerStarted","Data":"846aa22b798b8c9b22182359f4929a0a90e89a0ed0b32e24bdd54e683361d419"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.571132 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerStarted","Data":"8b4d5890ef9b3f9c1c231bdec2d0c5de2fc4ce600260b1bfc48ce81b6162b5ad"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.571141 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerStarted","Data":"ea12950ed42a8cb692af598414c1d40bc22a29c033c07a5eda7640b39a839ded"}
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.571249 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.598607 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.53462944 podStartE2EDuration="7.59859335s" podCreationTimestamp="2026-03-20 13:42:45 +0000 UTC" firstStartedPulling="2026-03-20 13:42:46.487246397 +0000 UTC m=+1265.996965363" lastFinishedPulling="2026-03-20 13:42:51.551210307 +0000 UTC m=+1271.060929273" observedRunningTime="2026-03-20 13:42:52.589434301 +0000 UTC m=+1272.099153277" watchObservedRunningTime="2026-03-20 13:42:52.59859335 +0000 UTC m=+1272.108312316"
Mar 20 13:42:52 crc kubenswrapper[4895]: I0320 13:42:52.628910 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.628893354 podStartE2EDuration="2.628893354s" podCreationTimestamp="2026-03-20 13:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:52.612533429 +0000 UTC m=+1272.122252405" watchObservedRunningTime="2026-03-20 13:42:52.628893354 +0000 UTC m=+1272.138612320"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.539558 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.583846 4895 generic.go:334] "Generic (PLEG): container finished" podID="489741aa-97a9-4f41-b94b-7e49b07a27f1" containerID="8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e" exitCode=0
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.583922 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"489741aa-97a9-4f41-b94b-7e49b07a27f1","Type":"ContainerDied","Data":"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"}
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.583954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"489741aa-97a9-4f41-b94b-7e49b07a27f1","Type":"ContainerDied","Data":"e28e47c38b270d664f6a2532bfd6bae4d6b1c0d031001f4bbb8118d5d9e864aa"}
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.583978 4895 scope.go:117] "RemoveContainer" containerID="8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.584119 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.587772 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7c1547aa-5cfe-4a70-bd45-0d06101f0e74","Type":"ContainerStarted","Data":"718a1d4ff954253202697ad88509257bfa8b4d2f31972279f164c3f7a358e2dd"}
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.629585 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.629570359 podStartE2EDuration="3.629570359s" podCreationTimestamp="2026-03-20 13:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:53.628001153 +0000 UTC m=+1273.137720139" watchObservedRunningTime="2026-03-20 13:42:53.629570359 +0000 UTC m=+1273.139289325"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.632203 4895 scope.go:117] "RemoveContainer" containerID="8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"
Mar 20 13:42:53 crc kubenswrapper[4895]: E0320 13:42:53.634631 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e\": container with ID starting with 8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e not found: ID does not exist" containerID="8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.634680 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e"} err="failed to get container status \"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e\": rpc error: code = NotFound desc = could not find container \"8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e\": container with ID starting with 8c566ada18cf1825cb8d485347bad4728cc8109ba8052150f21b626076df9e5e not found: ID does not exist"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635259 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635346 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ctw8\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635441 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635474 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635554 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.635683 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts\") pod \"489741aa-97a9-4f41-b94b-7e49b07a27f1\" (UID: \"489741aa-97a9-4f41-b94b-7e49b07a27f1\") "
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.648096 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8" (OuterVolumeSpecName: "kube-api-access-7ctw8") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "kube-api-access-7ctw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.655772 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts" (OuterVolumeSpecName: "scripts") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.655846 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.662271 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs" (OuterVolumeSpecName: "certs") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.673241 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.681502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data" (OuterVolumeSpecName: "config-data") pod "489741aa-97a9-4f41-b94b-7e49b07a27f1" (UID: "489741aa-97a9-4f41-b94b-7e49b07a27f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738199 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738238 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738253 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ctw8\" (UniqueName: \"kubernetes.io/projected/489741aa-97a9-4f41-b94b-7e49b07a27f1-kube-api-access-7ctw8\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738267 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738279 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.738291 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/489741aa-97a9-4f41-b94b-7e49b07a27f1-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.921293 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.980567 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.996246 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Mar 20 13:42:53 crc kubenswrapper[4895]: E0320 13:42:53.996769 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489741aa-97a9-4f41-b94b-7e49b07a27f1" containerName="cloudkitty-proc"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.996792 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="489741aa-97a9-4f41-b94b-7e49b07a27f1" containerName="cloudkitty-proc"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.997073 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="489741aa-97a9-4f41-b94b-7e49b07a27f1" containerName="cloudkitty-proc"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.998043 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:53 crc kubenswrapper[4895]: I0320 13:42:53.999761 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.006009 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.044833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.044913 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.044936 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.045000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtrn\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.045031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.045197 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147618 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147686 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147716 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtrn\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147759 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147803 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.147957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.153137 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.153148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.153648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.155357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.156109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.169085 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtrn\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn\") pod \"cloudkitty-proc-0\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.319181 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Mar 20 13:42:54 crc kubenswrapper[4895]: W0320 13:42:54.823590 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3ddc604_3d2a_45c7_99ae_2dd92d3d4517.slice/crio-1cbf7b1f9783f7870452cb65669b7801d298c8eaab0630c31ed72f6a3b5dd00a WatchSource:0}: Error finding container 1cbf7b1f9783f7870452cb65669b7801d298c8eaab0630c31ed72f6a3b5dd00a: Status 404 returned error can't find the container with id 1cbf7b1f9783f7870452cb65669b7801d298c8eaab0630c31ed72f6a3b5dd00a
Mar 20 13:42:54 crc kubenswrapper[4895]: I0320 13:42:54.824517 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.222979 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489741aa-97a9-4f41-b94b-7e49b07a27f1" path="/var/lib/kubelet/pods/489741aa-97a9-4f41-b94b-7e49b07a27f1/volumes"
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.449826 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.617307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517","Type":"ContainerStarted","Data":"70792aa593d19fda325f57f3d40186af6a6ff83b0fe62091093ba7c5b5349a57"}
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.617348 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517","Type":"ContainerStarted","Data":"1cbf7b1f9783f7870452cb65669b7801d298c8eaab0630c31ed72f6a3b5dd00a"}
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.638234 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.638217325 podStartE2EDuration="2.638217325s" podCreationTimestamp="2026-03-20 13:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:55.632970244 +0000 UTC m=+1275.142689210" watchObservedRunningTime="2026-03-20 13:42:55.638217325 +0000 UTC m=+1275.147936291"
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.866075 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.950985 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85d989d55b-spf52"
Mar 20 13:42:55 crc kubenswrapper[4895]: I0320 13:42:55.962855 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.122019 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-65558fd5f5-5tmzj"
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.155947 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz"
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.238714 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.243688 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"]
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.243912 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="dnsmasq-dns" containerID="cri-o://96da8cbac28b924a7c6a47f234535444b5c297e1b10f8fa93a9e21c08f94d390" gracePeriod=10
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.358461 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b99d76fbb-c92mx"
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.432796 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-85d989d55b-spf52"]
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.634591 4895 generic.go:334] "Generic (PLEG): container finished" podID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerID="96da8cbac28b924a7c6a47f234535444b5c297e1b10f8fa93a9e21c08f94d390" exitCode=0
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.634681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" event={"ID":"a31897ee-2067-4a4d-9ecd-c9ed35777f92","Type":"ContainerDied","Data":"96da8cbac28b924a7c6a47f234535444b5c297e1b10f8fa93a9e21c08f94d390"}
Mar 20 13:42:56 crc kubenswrapper[4895]: I0320 13:42:56.923136 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g"
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.006711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5mts\" (UniqueName: \"kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.006768 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.006810 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.006994 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.007049 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.007868 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.031078 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts" (OuterVolumeSpecName: "kube-api-access-m5mts") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "kube-api-access-m5mts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.089378 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.107895 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.108849 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config" (OuterVolumeSpecName: "config") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.109081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") pod \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\" (UID: \"a31897ee-2067-4a4d-9ecd-c9ed35777f92\") "
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.109518 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.109538 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.109548 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5mts\" (UniqueName: \"kubernetes.io/projected/a31897ee-2067-4a4d-9ecd-c9ed35777f92-kube-api-access-m5mts\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:57 crc kubenswrapper[4895]: W0320 13:42:57.109635 4895 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a31897ee-2067-4a4d-9ecd-c9ed35777f92/volumes/kubernetes.io~configmap/config
Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.109647 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config" (OuterVolumeSpecName: "config") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.116193 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.116772 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a31897ee-2067-4a4d-9ecd-c9ed35777f92" (UID: "a31897ee-2067-4a4d-9ecd-c9ed35777f92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.211467 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.211823 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.211965 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31897ee-2067-4a4d-9ecd-c9ed35777f92-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.646260 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.646318 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-bxj4g" event={"ID":"a31897ee-2067-4a4d-9ecd-c9ed35777f92","Type":"ContainerDied","Data":"e174a1beace5c4093b2b410210b3ee303b9f6a45e8cab8edc1c3d33aeb75cf4c"} Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.646364 4895 scope.go:117] "RemoveContainer" containerID="96da8cbac28b924a7c6a47f234535444b5c297e1b10f8fa93a9e21c08f94d390" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.646378 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-85d989d55b-spf52" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-log" containerID="cri-o://9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60" gracePeriod=30 Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.646558 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-85d989d55b-spf52" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-api" containerID="cri-o://d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9" gracePeriod=30 Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.676570 4895 scope.go:117] "RemoveContainer" containerID="265358b81e0e820dbb2aa1dc39d10ea13e0b7e36035b4cec890973eaf7caa307" Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.679465 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"] Mar 20 13:42:57 crc kubenswrapper[4895]: I0320 13:42:57.697611 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-bxj4g"] Mar 20 13:42:58 crc kubenswrapper[4895]: I0320 13:42:58.656225 4895 generic.go:334] "Generic (PLEG): container finished" podID="59f6cca3-9663-457d-b54d-21e2a1888aeb" 
containerID="9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60" exitCode=143 Mar 20 13:42:58 crc kubenswrapper[4895]: I0320 13:42:58.656291 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85d989d55b-spf52" event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerDied","Data":"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60"} Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.230891 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" path="/var/lib/kubelet/pods/a31897ee-2067-4a4d-9ecd-c9ed35777f92/volumes" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.766422 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:42:59 crc kubenswrapper[4895]: E0320 13:42:59.767142 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="init" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.767158 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="init" Mar 20 13:42:59 crc kubenswrapper[4895]: E0320 13:42:59.767178 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="dnsmasq-dns" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.767186 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="dnsmasq-dns" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.767446 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a31897ee-2067-4a4d-9ecd-c9ed35777f92" containerName="dnsmasq-dns" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.768181 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.770926 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.771211 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pzlwn" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.771560 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.786380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config-secret\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.786674 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx6pg\" (UniqueName: \"kubernetes.io/projected/c52a1a0f-5544-4b98-8746-4bb3d7066c87-kube-api-access-cx6pg\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.786740 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.786896 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.798814 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.888520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config-secret\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.888626 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx6pg\" (UniqueName: \"kubernetes.io/projected/c52a1a0f-5544-4b98-8746-4bb3d7066c87-kube-api-access-cx6pg\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.888653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.888706 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.889612 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.894996 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-openstack-config-secret\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.902498 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52a1a0f-5544-4b98-8746-4bb3d7066c87-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:42:59 crc kubenswrapper[4895]: I0320 13:42:59.904601 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx6pg\" (UniqueName: \"kubernetes.io/projected/c52a1a0f-5544-4b98-8746-4bb3d7066c87-kube-api-access-cx6pg\") pod \"openstackclient\" (UID: \"c52a1a0f-5544-4b98-8746-4bb3d7066c87\") " pod="openstack/openstackclient" Mar 20 13:43:00 crc kubenswrapper[4895]: I0320 13:43:00.106118 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:43:00 crc kubenswrapper[4895]: I0320 13:43:00.604127 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:43:00 crc kubenswrapper[4895]: I0320 13:43:00.680459 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c52a1a0f-5544-4b98-8746-4bb3d7066c87","Type":"ContainerStarted","Data":"6844db6506ecef6b20ca8c39a99ec6492bfb2431ea2002685a7f5a8fb71b77b7"} Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.232498 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.392477 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85d989d55b-spf52" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424267 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4v4j\" (UniqueName: \"kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424405 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 
13:43:01.424563 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424587 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424798 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.424849 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs\") pod \"59f6cca3-9663-457d-b54d-21e2a1888aeb\" (UID: \"59f6cca3-9663-457d-b54d-21e2a1888aeb\") " Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.426048 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs" (OuterVolumeSpecName: "logs") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.464715 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j" (OuterVolumeSpecName: "kube-api-access-d4v4j") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "kube-api-access-d4v4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.470528 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts" (OuterVolumeSpecName: "scripts") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.512805 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.528194 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.528228 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6cca3-9663-457d-b54d-21e2a1888aeb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.528238 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4v4j\" (UniqueName: \"kubernetes.io/projected/59f6cca3-9663-457d-b54d-21e2a1888aeb-kube-api-access-d4v4j\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.528250 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.531514 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data" (OuterVolumeSpecName: "config-data") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.561135 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.612212 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59f6cca3-9663-457d-b54d-21e2a1888aeb" (UID: "59f6cca3-9663-457d-b54d-21e2a1888aeb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.629881 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.630137 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.630459 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6cca3-9663-457d-b54d-21e2a1888aeb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.692903 4895 generic.go:334] "Generic (PLEG): container finished" podID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerID="d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9" exitCode=0 Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.692955 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85d989d55b-spf52" event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerDied","Data":"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9"} Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.692980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-85d989d55b-spf52" event={"ID":"59f6cca3-9663-457d-b54d-21e2a1888aeb","Type":"ContainerDied","Data":"949dc6cb2ae0ab3721265ac06f55535c1b449a1b61723f0889b56edc8f446e3f"} Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.692985 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85d989d55b-spf52" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.692996 4895 scope.go:117] "RemoveContainer" containerID="d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.739379 4895 scope.go:117] "RemoveContainer" containerID="9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.743806 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-85d989d55b-spf52"] Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.752788 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-85d989d55b-spf52"] Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.775850 4895 scope.go:117] "RemoveContainer" containerID="d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9" Mar 20 13:43:01 crc kubenswrapper[4895]: E0320 13:43:01.776246 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9\": container with ID starting with d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9 not found: ID does not exist" containerID="d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.776295 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9"} err="failed to get container status 
\"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9\": rpc error: code = NotFound desc = could not find container \"d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9\": container with ID starting with d67c19633b26322b52a8a1eca8a5ab08f49f7b5fcb7e8487601f013245be73d9 not found: ID does not exist" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.776317 4895 scope.go:117] "RemoveContainer" containerID="9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60" Mar 20 13:43:01 crc kubenswrapper[4895]: E0320 13:43:01.776605 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60\": container with ID starting with 9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60 not found: ID does not exist" containerID="9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60" Mar 20 13:43:01 crc kubenswrapper[4895]: I0320 13:43:01.776629 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60"} err="failed to get container status \"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60\": rpc error: code = NotFound desc = could not find container \"9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60\": container with ID starting with 9a6417c8855c1c5a4af3dfaaf0d42efbe65380e8bd3982b965324ecbb5023f60 not found: ID does not exist" Mar 20 13:43:03 crc kubenswrapper[4895]: I0320 13:43:03.224190 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" path="/var/lib/kubelet/pods/59f6cca3-9663-457d-b54d-21e2a1888aeb/volumes" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.218692 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d4b947447-mjwnr"] Mar 20 13:43:04 crc 
kubenswrapper[4895]: E0320 13:43:04.219170 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-log" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.219191 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-log" Mar 20 13:43:04 crc kubenswrapper[4895]: E0320 13:43:04.219220 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-api" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.219227 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-api" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.219415 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-log" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.219436 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f6cca3-9663-457d-b54d-21e2a1888aeb" containerName="placement-api" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.220547 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.223596 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.223672 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.223781 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.251479 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d4b947447-mjwnr"] Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.300355 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-run-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.300435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-etc-swift\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.300485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgls\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-kube-api-access-ftgls\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 
13:43:04.301855 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-config-data\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.301928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-log-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.301950 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-combined-ca-bundle\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.302044 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-public-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.302078 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-internal-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc 
kubenswrapper[4895]: I0320 13:43:04.403948 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-config-data\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404073 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-log-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-combined-ca-bundle\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404145 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-public-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404192 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-internal-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404224 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-run-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404249 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-etc-swift\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.404642 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-log-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.405279 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgls\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-kube-api-access-ftgls\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.407169 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/055085bc-2288-49cd-b07f-28747f5a6458-run-httpd\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.412539 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-public-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.412630 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-etc-swift\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.413203 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-combined-ca-bundle\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.413623 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-config-data\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.426788 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/055085bc-2288-49cd-b07f-28747f5a6458-internal-tls-certs\") pod \"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.426925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgls\" (UniqueName: \"kubernetes.io/projected/055085bc-2288-49cd-b07f-28747f5a6458-kube-api-access-ftgls\") pod 
\"swift-proxy-d4b947447-mjwnr\" (UID: \"055085bc-2288-49cd-b07f-28747f5a6458\") " pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:04 crc kubenswrapper[4895]: I0320 13:43:04.538123 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.154105 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d4b947447-mjwnr"] Mar 20 13:43:05 crc kubenswrapper[4895]: W0320 13:43:05.158710 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod055085bc_2288_49cd_b07f_28747f5a6458.slice/crio-ba1b600b8624d61c388d311f44385797bfb340e03a1950842855c02dfb626fdd WatchSource:0}: Error finding container ba1b600b8624d61c388d311f44385797bfb340e03a1950842855c02dfb626fdd: Status 404 returned error can't find the container with id ba1b600b8624d61c388d311f44385797bfb340e03a1950842855c02dfb626fdd Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.576508 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.577004 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-central-agent" containerID="cri-o://958b04dbda82d360e516f188cdccab8ea6bfcabb6915f55023e4c32671bce962" gracePeriod=30 Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.580343 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="sg-core" containerID="cri-o://0ef231dc8ab5097dac9c38d8ae108baf23e5d5d5910e5b917136b6dc2f81d191" gracePeriod=30 Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.580485 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="proxy-httpd" containerID="cri-o://6a31c32c86d08b7269edf1a12637c7c02f606ddf536e7551a2815a2112193001" gracePeriod=30 Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.580542 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-notification-agent" containerID="cri-o://7146ecf46b7155747607ccb8d57de7b0f88296f62062a56ff047cdf1a3b0ee6f" gracePeriod=30 Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.764816 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.195:3000/\": EOF" Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.796789 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b947447-mjwnr" event={"ID":"055085bc-2288-49cd-b07f-28747f5a6458","Type":"ContainerStarted","Data":"4d7baa1f4f7036cbc7c3856ee2d433f8b00d9b9e765939eb548844248acb8c3e"} Mar 20 13:43:05 crc kubenswrapper[4895]: I0320 13:43:05.796833 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b947447-mjwnr" event={"ID":"055085bc-2288-49cd-b07f-28747f5a6458","Type":"ContainerStarted","Data":"ba1b600b8624d61c388d311f44385797bfb340e03a1950842855c02dfb626fdd"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.480639 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56bf665d85-xzq8s" Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.548514 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.548865 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-6547c6468-fs8ld" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-api" containerID="cri-o://5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536" gracePeriod=30 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.549306 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6547c6468-fs8ld" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-httpd" containerID="cri-o://cd6786ff3e657bd26bbbf8b98f58c9cf972897524a74bdbaa3a483ccaa8af59e" gracePeriod=30 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825085 4895 generic.go:334] "Generic (PLEG): container finished" podID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerID="6a31c32c86d08b7269edf1a12637c7c02f606ddf536e7551a2815a2112193001" exitCode=0 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerDied","Data":"6a31c32c86d08b7269edf1a12637c7c02f606ddf536e7551a2815a2112193001"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825443 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerDied","Data":"0ef231dc8ab5097dac9c38d8ae108baf23e5d5d5910e5b917136b6dc2f81d191"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825408 4895 generic.go:334] "Generic (PLEG): container finished" podID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerID="0ef231dc8ab5097dac9c38d8ae108baf23e5d5d5910e5b917136b6dc2f81d191" exitCode=2 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825467 4895 generic.go:334] "Generic (PLEG): container finished" podID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerID="7146ecf46b7155747607ccb8d57de7b0f88296f62062a56ff047cdf1a3b0ee6f" exitCode=0 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825483 4895 
generic.go:334] "Generic (PLEG): container finished" podID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerID="958b04dbda82d360e516f188cdccab8ea6bfcabb6915f55023e4c32671bce962" exitCode=0 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825565 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerDied","Data":"7146ecf46b7155747607ccb8d57de7b0f88296f62062a56ff047cdf1a3b0ee6f"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.825576 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerDied","Data":"958b04dbda82d360e516f188cdccab8ea6bfcabb6915f55023e4c32671bce962"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.827889 4895 generic.go:334] "Generic (PLEG): container finished" podID="402d2058-787a-48d5-afb4-7f54fbf42121" containerID="cd6786ff3e657bd26bbbf8b98f58c9cf972897524a74bdbaa3a483ccaa8af59e" exitCode=0 Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.827948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerDied","Data":"cd6786ff3e657bd26bbbf8b98f58c9cf972897524a74bdbaa3a483ccaa8af59e"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.839263 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d4b947447-mjwnr" event={"ID":"055085bc-2288-49cd-b07f-28747f5a6458","Type":"ContainerStarted","Data":"309cff8e9418eb9b26d578cb92aa72ad78e480214b275df9ad9090d27a554996"} Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.839641 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.839689 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:06 crc kubenswrapper[4895]: I0320 13:43:06.865275 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d4b947447-mjwnr" podStartSLOduration=2.865261542 podStartE2EDuration="2.865261542s" podCreationTimestamp="2026-03-20 13:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:06.86167903 +0000 UTC m=+1286.371397996" watchObservedRunningTime="2026-03-20 13:43:06.865261542 +0000 UTC m=+1286.374980508" Mar 20 13:43:10 crc kubenswrapper[4895]: I0320 13:43:10.927058 4895 scope.go:117] "RemoveContainer" containerID="02d2d77999792227a994e2c682e8c062dbb0f08795c20d2ad015d9d89d1efec9" Mar 20 13:43:11 crc kubenswrapper[4895]: E0320 13:43:11.400576 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402d2058_787a_48d5_afb4_7f54fbf42121.slice/crio-conmon-5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402d2058_787a_48d5_afb4_7f54fbf42121.slice/crio-5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:43:11 crc kubenswrapper[4895]: I0320 13:43:11.917882 4895 generic.go:334] "Generic (PLEG): container finished" podID="402d2058-787a-48d5-afb4-7f54fbf42121" containerID="5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536" exitCode=0 Mar 20 13:43:11 crc kubenswrapper[4895]: I0320 13:43:11.918152 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" 
event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerDied","Data":"5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536"} Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.420659 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.484119 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.521805 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522213 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522334 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522597 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs\") pod \"402d2058-787a-48d5-afb4-7f54fbf42121\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522692 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config\") pod \"402d2058-787a-48d5-afb4-7f54fbf42121\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522830 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522945 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fddq5\" (UniqueName: \"kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5\") pod \"402d2058-787a-48d5-afb4-7f54fbf42121\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.523090 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config\") pod \"402d2058-787a-48d5-afb4-7f54fbf42121\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.523249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: 
\"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.523344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql88g\" (UniqueName: \"kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g\") pod \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\" (UID: \"75e42d1f-8ed4-4023-8d1f-588b49ba2f56\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.523456 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle\") pod \"402d2058-787a-48d5-afb4-7f54fbf42121\" (UID: \"402d2058-787a-48d5-afb4-7f54fbf42121\") " Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.522959 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.528576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts" (OuterVolumeSpecName: "scripts") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.528956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.536507 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "402d2058-787a-48d5-afb4-7f54fbf42121" (UID: "402d2058-787a-48d5-afb4-7f54fbf42121"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.539191 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5" (OuterVolumeSpecName: "kube-api-access-fddq5") pod "402d2058-787a-48d5-afb4-7f54fbf42121" (UID: "402d2058-787a-48d5-afb4-7f54fbf42121"). InnerVolumeSpecName "kube-api-access-fddq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.543588 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g" (OuterVolumeSpecName: "kube-api-access-ql88g") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "kube-api-access-ql88g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.578613 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.606283 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config" (OuterVolumeSpecName: "config") pod "402d2058-787a-48d5-afb4-7f54fbf42121" (UID: "402d2058-787a-48d5-afb4-7f54fbf42121"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.621213 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "402d2058-787a-48d5-afb4-7f54fbf42121" (UID: "402d2058-787a-48d5-afb4-7f54fbf42121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.623572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "402d2058-787a-48d5-afb4-7f54fbf42121" (UID: "402d2058-787a-48d5-afb4-7f54fbf42121"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.625766 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.625887 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.625946 4895 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626518 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626600 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626669 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fddq5\" (UniqueName: \"kubernetes.io/projected/402d2058-787a-48d5-afb4-7f54fbf42121-kube-api-access-fddq5\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626817 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626877 4895 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-ql88g\" (UniqueName: \"kubernetes.io/projected/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-kube-api-access-ql88g\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626931 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402d2058-787a-48d5-afb4-7f54fbf42121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.626984 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.658805 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data" (OuterVolumeSpecName: "config-data") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.665485 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75e42d1f-8ed4-4023-8d1f-588b49ba2f56" (UID: "75e42d1f-8ed4-4023-8d1f-588b49ba2f56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.728276 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.728308 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e42d1f-8ed4-4023-8d1f-588b49ba2f56-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.946977 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6547c6468-fs8ld" event={"ID":"402d2058-787a-48d5-afb4-7f54fbf42121","Type":"ContainerDied","Data":"f5b65a01fd153654df673706441529a9b4bc54793b01cb1a00cb7d4a41dacf44"} Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.947311 4895 scope.go:117] "RemoveContainer" containerID="cd6786ff3e657bd26bbbf8b98f58c9cf972897524a74bdbaa3a483ccaa8af59e" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.948086 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6547c6468-fs8ld" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.957948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e42d1f-8ed4-4023-8d1f-588b49ba2f56","Type":"ContainerDied","Data":"7ee4c1e11df7dac6889d15329b2f3a5f2f2034f42aaad505c46604a17e86e0ac"} Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.958071 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.970594 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c52a1a0f-5544-4b98-8746-4bb3d7066c87","Type":"ContainerStarted","Data":"f6f729a2b54dc5bacc5683cdedbb8903e1d4f5ca7855ce982b643c4e9758b162"} Mar 20 13:43:13 crc kubenswrapper[4895]: I0320 13:43:13.994135 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.5086453349999998 podStartE2EDuration="14.994114905s" podCreationTimestamp="2026-03-20 13:42:59 +0000 UTC" firstStartedPulling="2026-03-20 13:43:00.605879191 +0000 UTC m=+1280.115598157" lastFinishedPulling="2026-03-20 13:43:13.091348761 +0000 UTC m=+1292.601067727" observedRunningTime="2026-03-20 13:43:13.985054818 +0000 UTC m=+1293.494773784" watchObservedRunningTime="2026-03-20 13:43:13.994114905 +0000 UTC m=+1293.503833871" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.048199 4895 scope.go:117] "RemoveContainer" containerID="5f9834995622652f165234957453b476a1109dc323efc22ad4380bb371eba536" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.064210 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.076884 4895 scope.go:117] "RemoveContainer" containerID="6a31c32c86d08b7269edf1a12637c7c02f606ddf536e7551a2815a2112193001" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.096113 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6547c6468-fs8ld"] Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.113031 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.122689 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 
13:43:14.134892 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135584 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-central-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135608 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-central-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135643 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="proxy-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135656 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="proxy-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135692 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-notification-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135704 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-notification-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135724 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="sg-core" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135735 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="sg-core" Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135757 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-api" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135768 4895 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-api" Mar 20 13:43:14 crc kubenswrapper[4895]: E0320 13:43:14.135791 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.135801 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136107 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="proxy-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136140 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-notification-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136165 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-httpd" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136199 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" containerName="neutron-api" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136220 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="ceilometer-central-agent" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.136238 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" containerName="sg-core" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.139820 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.140065 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.141598 4895 scope.go:117] "RemoveContainer" containerID="0ef231dc8ab5097dac9c38d8ae108baf23e5d5d5910e5b917136b6dc2f81d191" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.144853 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.145256 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.180599 4895 scope.go:117] "RemoveContainer" containerID="7146ecf46b7155747607ccb8d57de7b0f88296f62062a56ff047cdf1a3b0ee6f" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.220855 4895 scope.go:117] "RemoveContainer" containerID="958b04dbda82d360e516f188cdccab8ea6bfcabb6915f55023e4c32671bce962" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246099 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxfrj\" (UniqueName: \"kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246152 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246267 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246283 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246298 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.246381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347596 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data\") pod \"ceilometer-0\" (UID: 
\"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347636 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347654 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347811 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxfrj\" (UniqueName: \"kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.347897 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.348700 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.348818 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.353804 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.354140 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.354860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.355178 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.364939 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxfrj\" (UniqueName: \"kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj\") pod \"ceilometer-0\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.471298 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.545427 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:14 crc kubenswrapper[4895]: I0320 13:43:14.562380 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d4b947447-mjwnr" Mar 20 13:43:15 crc kubenswrapper[4895]: I0320 13:43:15.004274 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:15 crc kubenswrapper[4895]: I0320 13:43:15.223096 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402d2058-787a-48d5-afb4-7f54fbf42121" path="/var/lib/kubelet/pods/402d2058-787a-48d5-afb4-7f54fbf42121/volumes" Mar 20 13:43:15 crc kubenswrapper[4895]: I0320 13:43:15.224208 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e42d1f-8ed4-4023-8d1f-588b49ba2f56" path="/var/lib/kubelet/pods/75e42d1f-8ed4-4023-8d1f-588b49ba2f56/volumes" Mar 20 13:43:15 crc kubenswrapper[4895]: I0320 13:43:15.991970 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerStarted","Data":"b53cb26e1148c79bb7b061a6044572898936997f183aa4d01ea75410fd09b6cc"} Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.003633 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerStarted","Data":"e850a33656701d162db75c000ab3002f1a862e55b1380a1b1e43ee7ce3ca1160"} Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.004204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerStarted","Data":"8a4d0881e7dc2a317e857493d749429190ac8532cedabaf742fb8816324549f5"} Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.058955 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-trcpf"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.060690 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.074069 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trcpf"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.101946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz7s\" (UniqueName: \"kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.102065 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.153135 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rg8bj"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.154561 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.162738 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rg8bj"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.203942 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz7s\" (UniqueName: \"kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.204009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs7m7\" (UniqueName: \"kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.204071 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.204113 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.204997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.259732 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz7s\" (UniqueName: \"kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s\") pod \"nova-api-db-create-trcpf\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.286458 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-568f-account-create-update-cbjxb"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.287808 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.289612 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.307524 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.307646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs7m7\" (UniqueName: \"kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.309552 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.313161 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-568f-account-create-update-cbjxb"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.332223 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs7m7\" (UniqueName: \"kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7\") pod \"nova-cell0-db-create-rg8bj\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.390096 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.409702 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbd7\" (UniqueName: \"kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7\") pod \"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.410321 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts\") pod \"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 
13:43:17.469775 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.473635 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-67bsv"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.474938 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.511848 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-571b-account-create-update-5b4mj"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.513195 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.513200 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbd7\" (UniqueName: \"kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7\") pod \"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.513697 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k4qb\" (UniqueName: \"kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.513776 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts\") pod 
\"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.513895 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.514628 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts\") pod \"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.516575 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.541054 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbd7\" (UniqueName: \"kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7\") pod \"nova-api-568f-account-create-update-cbjxb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.545137 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-67bsv"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.557451 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-571b-account-create-update-5b4mj"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.603510 4895 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-8bea-account-create-update-xt24z"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.605193 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.610331 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.612323 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8bea-account-create-update-xt24z"] Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.619627 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.620460 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d55l8\" (UniqueName: \"kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.620513 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.620630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: 
\"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.620720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k4qb\" (UniqueName: \"kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.621669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.644172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k4qb\" (UniqueName: \"kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb\") pod \"nova-cell1-db-create-67bsv\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.721998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.722157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4rb\" (UniqueName: \"kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb\") pod 
\"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.722203 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d55l8\" (UniqueName: \"kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.722231 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts\") pod \"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.723044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.753023 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d55l8\" (UniqueName: \"kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8\") pod \"nova-cell0-571b-account-create-update-5b4mj\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.792215 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.824337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4rb\" (UniqueName: \"kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb\") pod \"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.824489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts\") pod \"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.825381 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts\") pod \"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.842931 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.848897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4rb\" (UniqueName: \"kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb\") pod \"nova-cell1-8bea-account-create-update-xt24z\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:17 crc kubenswrapper[4895]: I0320 13:43:17.939161 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.051598 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerStarted","Data":"4d9f056665c4ff9f7d17d72e292ac1aceb8ff35ffb407868fce50f685b4a121a"} Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.117054 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rg8bj"] Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.251934 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trcpf"] Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.420313 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-568f-account-create-update-cbjxb"] Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.559664 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-571b-account-create-update-5b4mj"] Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.576800 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-67bsv"] Mar 20 13:43:18 crc kubenswrapper[4895]: W0320 13:43:18.577035 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66fce8ef_22cc_4aa4_977b_8cf7382053d5.slice/crio-b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c WatchSource:0}: Error finding container b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c: Status 404 returned error can't find the container with id b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c Mar 20 13:43:18 crc kubenswrapper[4895]: I0320 13:43:18.590813 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8bea-account-create-update-xt24z"] Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.063649 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" event={"ID":"4b287388-cee5-4065-9b38-56633ce573c2","Type":"ContainerStarted","Data":"46ab3d474906c44030666edb47b862d09dd86959e99837e69a41520f215ad64b"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.065698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-67bsv" event={"ID":"66fce8ef-22cc-4aa4-977b-8cf7382053d5","Type":"ContainerStarted","Data":"b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.067186 4895 generic.go:334] "Generic (PLEG): container finished" podID="95a123ea-8352-432e-af35-f2b275d1dbbb" containerID="2ca0045f3b36bbb500f5c3da216cce47c61028ede373a9f162d0d442b21ec387" exitCode=0 Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.067231 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-568f-account-create-update-cbjxb" event={"ID":"95a123ea-8352-432e-af35-f2b275d1dbbb","Type":"ContainerDied","Data":"2ca0045f3b36bbb500f5c3da216cce47c61028ede373a9f162d0d442b21ec387"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.067246 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-568f-account-create-update-cbjxb" 
event={"ID":"95a123ea-8352-432e-af35-f2b275d1dbbb","Type":"ContainerStarted","Data":"623a0ffe74ad4f4063e17cee206b8f57f87196fb9c0f9fc67ad153855cb74346"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.071569 4895 generic.go:334] "Generic (PLEG): container finished" podID="f1128d7a-df8d-4255-af0a-4ed8058e8fa4" containerID="19fd0a9821c640ce869017aa8d909f86de2f344e1f1c8ba94cde5d7fd88f7b3d" exitCode=0 Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.071630 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trcpf" event={"ID":"f1128d7a-df8d-4255-af0a-4ed8058e8fa4","Type":"ContainerDied","Data":"19fd0a9821c640ce869017aa8d909f86de2f344e1f1c8ba94cde5d7fd88f7b3d"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.071647 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trcpf" event={"ID":"f1128d7a-df8d-4255-af0a-4ed8058e8fa4","Type":"ContainerStarted","Data":"fc0c4ed462c2f0836870644d3f08a5a7dcc428cf62f3c0a3471144c9a3b5a2b8"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.073476 4895 generic.go:334] "Generic (PLEG): container finished" podID="228bc5a9-17f9-4434-aca9-685d14b15c62" containerID="f2e822bad642583c2e2023cb57578e9ac5b7b26c1ccf56b2be7d83cbf0ae5bc8" exitCode=0 Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.073580 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rg8bj" event={"ID":"228bc5a9-17f9-4434-aca9-685d14b15c62","Type":"ContainerDied","Data":"f2e822bad642583c2e2023cb57578e9ac5b7b26c1ccf56b2be7d83cbf0ae5bc8"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.073636 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rg8bj" event={"ID":"228bc5a9-17f9-4434-aca9-685d14b15c62","Type":"ContainerStarted","Data":"d935bef99037460ab3e65c5f3c0b6cd94361de1b755ba97b92f53a9a0b629b8f"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.075025 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" event={"ID":"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90","Type":"ContainerStarted","Data":"223a4adb81555c05ddc3c13185382400e24a7c287c4308a1a9a312a683b54223"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.075052 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" event={"ID":"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90","Type":"ContainerStarted","Data":"cc425054cefd5c676fb406ce7ae3b65ed88f45a0c74db771cb068739d1ba06cf"} Mar 20 13:43:19 crc kubenswrapper[4895]: I0320 13:43:19.103678 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" podStartSLOduration=2.103660595 podStartE2EDuration="2.103660595s" podCreationTimestamp="2026-03-20 13:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:19.099348139 +0000 UTC m=+1298.609067105" watchObservedRunningTime="2026-03-20 13:43:19.103660595 +0000 UTC m=+1298.613379561" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.088541 4895 generic.go:334] "Generic (PLEG): container finished" podID="795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" containerID="223a4adb81555c05ddc3c13185382400e24a7c287c4308a1a9a312a683b54223" exitCode=0 Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.088646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" event={"ID":"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90","Type":"ContainerDied","Data":"223a4adb81555c05ddc3c13185382400e24a7c287c4308a1a9a312a683b54223"} Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.090673 4895 generic.go:334] "Generic (PLEG): container finished" podID="4b287388-cee5-4065-9b38-56633ce573c2" containerID="b3dfb5003971363de16e8c4e190acfd80815d27ea5d973767348b69576c43919" exitCode=0 
Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.090748 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" event={"ID":"4b287388-cee5-4065-9b38-56633ce573c2","Type":"ContainerDied","Data":"b3dfb5003971363de16e8c4e190acfd80815d27ea5d973767348b69576c43919"} Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.096555 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerStarted","Data":"e6538945b8467358fd3380b527f24b0ab1c04855501067af81d37d5d3307e2d9"} Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.097267 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.113063 4895 generic.go:334] "Generic (PLEG): container finished" podID="66fce8ef-22cc-4aa4-977b-8cf7382053d5" containerID="0dbc1613fa99afe295156edac6f37b60876898bbb5cdcc85937fc7b877a669a4" exitCode=0 Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.113270 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-67bsv" event={"ID":"66fce8ef-22cc-4aa4-977b-8cf7382053d5","Type":"ContainerDied","Data":"0dbc1613fa99afe295156edac6f37b60876898bbb5cdcc85937fc7b877a669a4"} Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.204325 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.868583948 podStartE2EDuration="6.204308259s" podCreationTimestamp="2026-03-20 13:43:14 +0000 UTC" firstStartedPulling="2026-03-20 13:43:15.004588354 +0000 UTC m=+1294.514307330" lastFinishedPulling="2026-03-20 13:43:19.340312675 +0000 UTC m=+1298.850031641" observedRunningTime="2026-03-20 13:43:20.199921121 +0000 UTC m=+1299.709640087" watchObservedRunningTime="2026-03-20 13:43:20.204308259 +0000 UTC m=+1299.714027225" Mar 20 13:43:20 crc 
kubenswrapper[4895]: I0320 13:43:20.884510 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.890162 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.898100 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.908330 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs7m7\" (UniqueName: \"kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7\") pod \"228bc5a9-17f9-4434-aca9-685d14b15c62\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.908703 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts\") pod \"228bc5a9-17f9-4434-aca9-685d14b15c62\" (UID: \"228bc5a9-17f9-4434-aca9-685d14b15c62\") " Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.909887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "228bc5a9-17f9-4434-aca9-685d14b15c62" (UID: "228bc5a9-17f9-4434-aca9-685d14b15c62"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:20 crc kubenswrapper[4895]: I0320 13:43:20.920867 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7" (OuterVolumeSpecName: "kube-api-access-vs7m7") pod "228bc5a9-17f9-4434-aca9-685d14b15c62" (UID: "228bc5a9-17f9-4434-aca9-685d14b15c62"). InnerVolumeSpecName "kube-api-access-vs7m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.010890 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzbd7\" (UniqueName: \"kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7\") pod \"95a123ea-8352-432e-af35-f2b275d1dbbb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.011145 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts\") pod \"95a123ea-8352-432e-af35-f2b275d1dbbb\" (UID: \"95a123ea-8352-432e-af35-f2b275d1dbbb\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.011171 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts\") pod \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.011199 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkz7s\" (UniqueName: \"kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s\") pod \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\" (UID: \"f1128d7a-df8d-4255-af0a-4ed8058e8fa4\") " Mar 20 13:43:21 crc 
kubenswrapper[4895]: I0320 13:43:21.011913 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/228bc5a9-17f9-4434-aca9-685d14b15c62-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.011936 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs7m7\" (UniqueName: \"kubernetes.io/projected/228bc5a9-17f9-4434-aca9-685d14b15c62-kube-api-access-vs7m7\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.012320 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95a123ea-8352-432e-af35-f2b275d1dbbb" (UID: "95a123ea-8352-432e-af35-f2b275d1dbbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.012437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1128d7a-df8d-4255-af0a-4ed8058e8fa4" (UID: "f1128d7a-df8d-4255-af0a-4ed8058e8fa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.014775 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s" (OuterVolumeSpecName: "kube-api-access-kkz7s") pod "f1128d7a-df8d-4255-af0a-4ed8058e8fa4" (UID: "f1128d7a-df8d-4255-af0a-4ed8058e8fa4"). InnerVolumeSpecName "kube-api-access-kkz7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.018552 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7" (OuterVolumeSpecName: "kube-api-access-gzbd7") pod "95a123ea-8352-432e-af35-f2b275d1dbbb" (UID: "95a123ea-8352-432e-af35-f2b275d1dbbb"). InnerVolumeSpecName "kube-api-access-gzbd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.113465 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a123ea-8352-432e-af35-f2b275d1dbbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.113512 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.113522 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkz7s\" (UniqueName: \"kubernetes.io/projected/f1128d7a-df8d-4255-af0a-4ed8058e8fa4-kube-api-access-kkz7s\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.113532 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzbd7\" (UniqueName: \"kubernetes.io/projected/95a123ea-8352-432e-af35-f2b275d1dbbb-kube-api-access-gzbd7\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.126178 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-trcpf" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.126663 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trcpf" event={"ID":"f1128d7a-df8d-4255-af0a-4ed8058e8fa4","Type":"ContainerDied","Data":"fc0c4ed462c2f0836870644d3f08a5a7dcc428cf62f3c0a3471144c9a3b5a2b8"} Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.126719 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc0c4ed462c2f0836870644d3f08a5a7dcc428cf62f3c0a3471144c9a3b5a2b8" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.137723 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rg8bj" event={"ID":"228bc5a9-17f9-4434-aca9-685d14b15c62","Type":"ContainerDied","Data":"d935bef99037460ab3e65c5f3c0b6cd94361de1b755ba97b92f53a9a0b629b8f"} Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.137768 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d935bef99037460ab3e65c5f3c0b6cd94361de1b755ba97b92f53a9a0b629b8f" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.137876 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rg8bj" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.142089 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-568f-account-create-update-cbjxb" event={"ID":"95a123ea-8352-432e-af35-f2b275d1dbbb","Type":"ContainerDied","Data":"623a0ffe74ad4f4063e17cee206b8f57f87196fb9c0f9fc67ad153855cb74346"} Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.142137 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623a0ffe74ad4f4063e17cee206b8f57f87196fb9c0f9fc67ad153855cb74346" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.143813 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-568f-account-create-update-cbjxb" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.309600 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.310159 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-log" containerID="cri-o://0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54" gracePeriod=30 Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.310509 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-httpd" containerID="cri-o://4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2" gracePeriod=30 Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.443275 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.536130 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k4qb\" (UniqueName: \"kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb\") pod \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.536231 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts\") pod \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\" (UID: \"66fce8ef-22cc-4aa4-977b-8cf7382053d5\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.537716 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66fce8ef-22cc-4aa4-977b-8cf7382053d5" (UID: "66fce8ef-22cc-4aa4-977b-8cf7382053d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.543547 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb" (OuterVolumeSpecName: "kube-api-access-7k4qb") pod "66fce8ef-22cc-4aa4-977b-8cf7382053d5" (UID: "66fce8ef-22cc-4aa4-977b-8cf7382053d5"). InnerVolumeSpecName "kube-api-access-7k4qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.639688 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k4qb\" (UniqueName: \"kubernetes.io/projected/66fce8ef-22cc-4aa4-977b-8cf7382053d5-kube-api-access-7k4qb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.639716 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66fce8ef-22cc-4aa4-977b-8cf7382053d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.803299 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.819680 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.842806 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d55l8\" (UniqueName: \"kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8\") pod \"4b287388-cee5-4065-9b38-56633ce573c2\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.842978 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts\") pod \"4b287388-cee5-4065-9b38-56633ce573c2\" (UID: \"4b287388-cee5-4065-9b38-56633ce573c2\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.843313 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "4b287388-cee5-4065-9b38-56633ce573c2" (UID: "4b287388-cee5-4065-9b38-56633ce573c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.843699 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b287388-cee5-4065-9b38-56633ce573c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.849619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8" (OuterVolumeSpecName: "kube-api-access-d55l8") pod "4b287388-cee5-4065-9b38-56633ce573c2" (UID: "4b287388-cee5-4065-9b38-56633ce573c2"). InnerVolumeSpecName "kube-api-access-d55l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.946691 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts\") pod \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.947056 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv4rb\" (UniqueName: \"kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb\") pod \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\" (UID: \"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90\") " Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.947530 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d55l8\" (UniqueName: \"kubernetes.io/projected/4b287388-cee5-4065-9b38-56633ce573c2-kube-api-access-d55l8\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 
13:43:21.948519 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" (UID: "795dcfb9-6fb6-42f6-adbf-4e77aef2bd90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:21 crc kubenswrapper[4895]: I0320 13:43:21.963660 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb" (OuterVolumeSpecName: "kube-api-access-kv4rb") pod "795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" (UID: "795dcfb9-6fb6-42f6-adbf-4e77aef2bd90"). InnerVolumeSpecName "kube-api-access-kv4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.049161 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv4rb\" (UniqueName: \"kubernetes.io/projected/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-kube-api-access-kv4rb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.049192 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.153880 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-67bsv" event={"ID":"66fce8ef-22cc-4aa4-977b-8cf7382053d5","Type":"ContainerDied","Data":"b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c"} Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.153919 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f3986f81aea30e6d9069d928c693f0cb1f2255678c697b49241f64cbd1e68c" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 
13:43:22.153971 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-67bsv" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.160462 4895 generic.go:334] "Generic (PLEG): container finished" podID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerID="0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54" exitCode=143 Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.160535 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerDied","Data":"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54"} Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.162502 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" event={"ID":"795dcfb9-6fb6-42f6-adbf-4e77aef2bd90","Type":"ContainerDied","Data":"cc425054cefd5c676fb406ce7ae3b65ed88f45a0c74db771cb068739d1ba06cf"} Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.162540 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc425054cefd5c676fb406ce7ae3b65ed88f45a0c74db771cb068739d1ba06cf" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.162601 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bea-account-create-update-xt24z" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.164478 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" event={"ID":"4b287388-cee5-4065-9b38-56633ce573c2","Type":"ContainerDied","Data":"46ab3d474906c44030666edb47b862d09dd86959e99837e69a41520f215ad64b"} Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.164517 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-571b-account-create-update-5b4mj" Mar 20 13:43:22 crc kubenswrapper[4895]: I0320 13:43:22.164535 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ab3d474906c44030666edb47b862d09dd86959e99837e69a41520f215ad64b" Mar 20 13:43:24 crc kubenswrapper[4895]: I0320 13:43:24.291877 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:24 crc kubenswrapper[4895]: I0320 13:43:24.292640 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-log" containerID="cri-o://30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821" gracePeriod=30 Mar 20 13:43:24 crc kubenswrapper[4895]: I0320 13:43:24.292700 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-httpd" containerID="cri-o://2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed" gracePeriod=30 Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.056081 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.106902 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107003 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107030 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107159 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107244 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w7bg\" (UniqueName: \"kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107283 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data\") pod \"9a17c32e-3090-49b2-ac2a-91572b5eab39\" (UID: \"9a17c32e-3090-49b2-ac2a-91572b5eab39\") " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.107777 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.108124 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs" (OuterVolumeSpecName: "logs") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.108447 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.108476 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a17c32e-3090-49b2-ac2a-91572b5eab39-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.115407 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts" (OuterVolumeSpecName: "scripts") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.148562 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg" (OuterVolumeSpecName: "kube-api-access-7w7bg") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "kube-api-access-7w7bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.184086 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2" (OuterVolumeSpecName: "glance") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.184176 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.195592 4895 generic.go:334] "Generic (PLEG): container finished" podID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerID="30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821" exitCode=143 Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.195643 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.195658 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerDied","Data":"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821"} Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.197985 4895 generic.go:334] "Generic (PLEG): container finished" podID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerID="4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2" exitCode=0 Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.198012 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerDied","Data":"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2"} Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.198021 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.198063 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a17c32e-3090-49b2-ac2a-91572b5eab39","Type":"ContainerDied","Data":"d7d4b08fcb85851a249860cee28e8e9881e276a6f79ebc5b9a7500dce79e1829"} Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.198081 4895 scope.go:117] "RemoveContainer" containerID="4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.209728 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w7bg\" (UniqueName: \"kubernetes.io/projected/9a17c32e-3090-49b2-ac2a-91572b5eab39-kube-api-access-7w7bg\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.209769 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") on node \"crc\" " Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.209781 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.209789 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.209799 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: 
I0320 13:43:25.214115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data" (OuterVolumeSpecName: "config-data") pod "9a17c32e-3090-49b2-ac2a-91572b5eab39" (UID: "9a17c32e-3090-49b2-ac2a-91572b5eab39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.232209 4895 scope.go:117] "RemoveContainer" containerID="0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.232226 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.232422 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2") on node "crc" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.256953 4895 scope.go:117] "RemoveContainer" containerID="4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.257489 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2\": container with ID starting with 4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2 not found: ID does not exist" containerID="4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.257533 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2"} err="failed to get container status 
\"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2\": rpc error: code = NotFound desc = could not find container \"4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2\": container with ID starting with 4e57b6c1a748baf4d5e7d11d1d99b969ed7fdf8e2db17106f55bcd4a33951de2 not found: ID does not exist" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.257562 4895 scope.go:117] "RemoveContainer" containerID="0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.257870 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54\": container with ID starting with 0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54 not found: ID does not exist" containerID="0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.257908 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54"} err="failed to get container status \"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54\": rpc error: code = NotFound desc = could not find container \"0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54\": container with ID starting with 0d482799b10513941852be023f4d05fb7eff2a50490cd2832d76c56a6d93df54 not found: ID does not exist" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.311872 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a17c32e-3090-49b2-ac2a-91572b5eab39-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.311905 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.523527 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.535927 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546167 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546636 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546664 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546723 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-httpd" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546732 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-httpd" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546751 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1128d7a-df8d-4255-af0a-4ed8058e8fa4" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546759 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1128d7a-df8d-4255-af0a-4ed8058e8fa4" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546773 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-log" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546780 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-log" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546792 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228bc5a9-17f9-4434-aca9-685d14b15c62" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546799 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="228bc5a9-17f9-4434-aca9-685d14b15c62" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546819 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b287388-cee5-4065-9b38-56633ce573c2" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546827 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b287388-cee5-4065-9b38-56633ce573c2" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546842 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a123ea-8352-432e-af35-f2b275d1dbbb" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546850 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a123ea-8352-432e-af35-f2b275d1dbbb" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: E0320 13:43:25.546864 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66fce8ef-22cc-4aa4-977b-8cf7382053d5" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.546870 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fce8ef-22cc-4aa4-977b-8cf7382053d5" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547101 
4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-httpd" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547117 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1128d7a-df8d-4255-af0a-4ed8058e8fa4" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547135 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" containerName="glance-log" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547152 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="228bc5a9-17f9-4434-aca9-685d14b15c62" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547165 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b287388-cee5-4065-9b38-56633ce573c2" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547173 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fce8ef-22cc-4aa4-977b-8cf7382053d5" containerName="mariadb-database-create" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547186 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a123ea-8352-432e-af35-f2b275d1dbbb" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.547204 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" containerName="mariadb-account-create-update" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.548746 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.550763 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.550936 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.561601 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.617820 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.617921 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.617950 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkll\" (UniqueName: \"kubernetes.io/projected/cf90b0b3-45b3-4926-bf71-703b79d4cae4-kube-api-access-ztkll\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.618007 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.618036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.618058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.618142 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.618175 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719119 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719158 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719243 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719298 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719340 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719356 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkll\" (UniqueName: \"kubernetes.io/projected/cf90b0b3-45b3-4926-bf71-703b79d4cae4-kube-api-access-ztkll\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719413 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.719842 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.720084 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf90b0b3-45b3-4926-bf71-703b79d4cae4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.723913 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.724993 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.725495 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.730000 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf90b0b3-45b3-4926-bf71-703b79d4cae4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.747599 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkll\" (UniqueName: \"kubernetes.io/projected/cf90b0b3-45b3-4926-bf71-703b79d4cae4-kube-api-access-ztkll\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.750741 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.750773 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b6443358e93f73fc16f0714d0b2e759c539b0d4c74a7e83912ba5d1f8dded95/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.835022 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1c143bc4-07e7-468b-9d64-2a90f3336fc2\") pod \"glance-default-external-api-0\" (UID: \"cf90b0b3-45b3-4926-bf71-703b79d4cae4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:43:25 crc kubenswrapper[4895]: I0320 13:43:25.865172 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:43:26 crc kubenswrapper[4895]: W0320 13:43:26.591224 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf90b0b3_45b3_4926_bf71_703b79d4cae4.slice/crio-aadedfcd0f970562191e64dc7bc8577f57afa7c96968729cc6efb92fd457aa4a WatchSource:0}: Error finding container aadedfcd0f970562191e64dc7bc8577f57afa7c96968729cc6efb92fd457aa4a: Status 404 returned error can't find the container with id aadedfcd0f970562191e64dc7bc8577f57afa7c96968729cc6efb92fd457aa4a Mar 20 13:43:26 crc kubenswrapper[4895]: I0320 13:43:26.592815 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.222379 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a17c32e-3090-49b2-ac2a-91572b5eab39" path="/var/lib/kubelet/pods/9a17c32e-3090-49b2-ac2a-91572b5eab39/volumes" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.292385 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf90b0b3-45b3-4926-bf71-703b79d4cae4","Type":"ContainerStarted","Data":"1171804d6ab338f7de53c23d01ddf9b1ffe80d2cffb8e90568a75bc0c06388b4"} Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.292825 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf90b0b3-45b3-4926-bf71-703b79d4cae4","Type":"ContainerStarted","Data":"aadedfcd0f970562191e64dc7bc8577f57afa7c96968729cc6efb92fd457aa4a"} Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.881448 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-68krz"] Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.882938 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.884935 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2hndf" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.885171 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.887110 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.888977 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.889071 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.889090 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqsg\" (UniqueName: \"kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.889142 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.901335 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-68krz"] Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.990555 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.990643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.990739 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:27 crc kubenswrapper[4895]: I0320 13:43:27.990761 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqsg\" (UniqueName: \"kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " 
pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:27.998896 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.001033 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.008223 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.019004 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqsg\" (UniqueName: \"kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg\") pod \"nova-cell0-conductor-db-sync-68krz\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.219831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.274801 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.355411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf90b0b3-45b3-4926-bf71-703b79d4cae4","Type":"ContainerStarted","Data":"993b84f36b2e388b2862e1cec3423a7eacec0de49bd695b452d6d15705b003cb"} Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.380650 4895 generic.go:334] "Generic (PLEG): container finished" podID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerID="2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed" exitCode=0 Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.380704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerDied","Data":"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed"} Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.380732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7","Type":"ContainerDied","Data":"d895dc483020f93f1aa4747ce070728e1f6e4e4b101977a8420380cf7d26f51f"} Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.380751 4895 scope.go:117] "RemoveContainer" containerID="2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.380912 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.388704 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.388684321 podStartE2EDuration="3.388684321s" podCreationTimestamp="2026-03-20 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:28.382785656 +0000 UTC m=+1307.892504622" watchObservedRunningTime="2026-03-20 13:43:28.388684321 +0000 UTC m=+1307.898403287" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404117 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404218 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404473 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2szfk\" (UniqueName: \"kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: 
\"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404510 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404553 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404745 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.404781 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run\") pod \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\" (UID: \"d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7\") " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.409655 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs" (OuterVolumeSpecName: "logs") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.415725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.419574 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts" (OuterVolumeSpecName: "scripts") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.424320 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk" (OuterVolumeSpecName: "kube-api-access-2szfk") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "kube-api-access-2szfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.470512 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57" (OuterVolumeSpecName: "glance") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.497025 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506817 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506849 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506863 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2szfk\" (UniqueName: \"kubernetes.io/projected/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-kube-api-access-2szfk\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506885 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") on node \"crc\" " Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506895 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.506904 4895 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.513090 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.571655 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.571895 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57") on node "crc" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.605418 4895 scope.go:117] "RemoveContainer" containerID="30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.609716 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.609993 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.640322 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data" (OuterVolumeSpecName: "config-data") pod "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" (UID: "d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.711777 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.755901 4895 scope.go:117] "RemoveContainer" containerID="2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.759447 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:28 crc kubenswrapper[4895]: E0320 13:43:28.759571 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed\": container with ID starting with 2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed not found: ID does not exist" containerID="2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.759627 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed"} err="failed to get container status \"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed\": rpc error: code = NotFound desc = could not find container \"2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed\": container with ID starting with 2dd811dd7ce3c2c2e204aa01602285e00dadcf858750c3579465ba810e4b74ed not found: ID does not exist" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 
13:43:28.759657 4895 scope.go:117] "RemoveContainer" containerID="30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821" Mar 20 13:43:28 crc kubenswrapper[4895]: E0320 13:43:28.765592 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821\": container with ID starting with 30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821 not found: ID does not exist" containerID="30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.765637 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821"} err="failed to get container status \"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821\": rpc error: code = NotFound desc = could not find container \"30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821\": container with ID starting with 30961b7b02e9dc857994721f8fe158ad7c2ac39071ea30842439add31a8a7821 not found: ID does not exist" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.769975 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.778062 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:28 crc kubenswrapper[4895]: E0320 13:43:28.778459 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-httpd" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.778476 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-httpd" Mar 20 13:43:28 crc kubenswrapper[4895]: E0320 13:43:28.778505 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-log" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.778512 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-log" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.778703 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-log" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.778724 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" containerName="glance-httpd" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.780167 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.783075 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.783426 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.807808 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.873059 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.873332 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-central-agent" containerID="cri-o://8a4d0881e7dc2a317e857493d749429190ac8532cedabaf742fb8816324549f5" gracePeriod=30 Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.874928 4895 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="proxy-httpd" containerID="cri-o://e6538945b8467358fd3380b527f24b0ab1c04855501067af81d37d5d3307e2d9" gracePeriod=30 Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.875063 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="sg-core" containerID="cri-o://4d9f056665c4ff9f7d17d72e292ac1aceb8ff35ffb407868fce50f685b4a121a" gracePeriod=30 Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.875122 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-notification-agent" containerID="cri-o://e850a33656701d162db75c000ab3002f1a862e55b1380a1b1e43ee7ce3ca1160" gracePeriod=30 Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915101 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915168 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnglv\" (UniqueName: 
\"kubernetes.io/projected/06d364d4-5809-40d8-8e14-11ae873d4c47-kube-api-access-vnglv\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915267 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915288 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915330 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915357 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.915481 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:28 crc kubenswrapper[4895]: I0320 13:43:28.965432 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-68krz"] Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnglv\" (UniqueName: \"kubernetes.io/projected/06d364d4-5809-40d8-8e14-11ae873d4c47-kube-api-access-vnglv\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019775 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019801 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019825 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.019852 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.020365 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.020612 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d364d4-5809-40d8-8e14-11ae873d4c47-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.034602 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.040203 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.068314 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.068469 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/18ebb825098e65852293f2b0f63099f5113b6726c6c2675c80c59a63de5999b9/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.069997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.080658 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d364d4-5809-40d8-8e14-11ae873d4c47-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.081147 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnglv\" (UniqueName: \"kubernetes.io/projected/06d364d4-5809-40d8-8e14-11ae873d4c47-kube-api-access-vnglv\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.231312 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7" path="/var/lib/kubelet/pods/d6b9f3d1-2bfd-43a4-ba9e-2f5008d865d7/volumes" Mar 20 
13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.239117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c7bae2e4-3210-4dcf-a98c-324f929f7a57\") pod \"glance-default-internal-api-0\" (UID: \"06d364d4-5809-40d8-8e14-11ae873d4c47\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.406836 4895 generic.go:334] "Generic (PLEG): container finished" podID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerID="e6538945b8467358fd3380b527f24b0ab1c04855501067af81d37d5d3307e2d9" exitCode=0 Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.406876 4895 generic.go:334] "Generic (PLEG): container finished" podID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerID="4d9f056665c4ff9f7d17d72e292ac1aceb8ff35ffb407868fce50f685b4a121a" exitCode=2 Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.406923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerDied","Data":"e6538945b8467358fd3380b527f24b0ab1c04855501067af81d37d5d3307e2d9"} Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.407006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerDied","Data":"4d9f056665c4ff9f7d17d72e292ac1aceb8ff35ffb407868fce50f685b4a121a"} Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.407963 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.409997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-68krz" event={"ID":"2b0f7494-037c-462e-bd52-4a4d2469c62d","Type":"ContainerStarted","Data":"decf57fbe71c454685cba312a919807bd43b02c98639638e7ec8030cbb2793a2"} Mar 20 13:43:29 crc kubenswrapper[4895]: I0320 13:43:29.781487 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.078686 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.429908 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d364d4-5809-40d8-8e14-11ae873d4c47","Type":"ContainerStarted","Data":"09ef4c551cdd4e84f6dcabd654873456aa8f950ba483ce4c309fe90175a2b62d"} Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.435260 4895 generic.go:334] "Generic (PLEG): container finished" podID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerID="e850a33656701d162db75c000ab3002f1a862e55b1380a1b1e43ee7ce3ca1160" exitCode=0 Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.435293 4895 generic.go:334] "Generic (PLEG): container finished" podID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerID="8a4d0881e7dc2a317e857493d749429190ac8532cedabaf742fb8816324549f5" exitCode=0 Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.435311 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerDied","Data":"e850a33656701d162db75c000ab3002f1a862e55b1380a1b1e43ee7ce3ca1160"} Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.435333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerDied","Data":"8a4d0881e7dc2a317e857493d749429190ac8532cedabaf742fb8816324549f5"} Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.636355 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.764984 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.765178 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.765843 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.765906 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.765966 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxfrj\" (UniqueName: \"kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj\") pod 
\"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.765982 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.766229 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle\") pod \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\" (UID: \"45105c76-5ee7-447e-a4c8-22cb30dbc6ff\") " Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.769441 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.769667 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.786655 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts" (OuterVolumeSpecName: "scripts") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.786724 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj" (OuterVolumeSpecName: "kube-api-access-rxfrj") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "kube-api-access-rxfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.840198 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.868234 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxfrj\" (UniqueName: \"kubernetes.io/projected/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-kube-api-access-rxfrj\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.868271 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.868283 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.868293 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.868302 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.917915 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:30 crc kubenswrapper[4895]: I0320 13:43:30.970268 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.010207 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data" (OuterVolumeSpecName: "config-data") pod "45105c76-5ee7-447e-a4c8-22cb30dbc6ff" (UID: "45105c76-5ee7-447e-a4c8-22cb30dbc6ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.077852 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45105c76-5ee7-447e-a4c8-22cb30dbc6ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.456755 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d364d4-5809-40d8-8e14-11ae873d4c47","Type":"ContainerStarted","Data":"13256c7892f71581837830f1031fd9cbcb97fe5efc394e9e1074992755dcbc77"} Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.472758 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45105c76-5ee7-447e-a4c8-22cb30dbc6ff","Type":"ContainerDied","Data":"b53cb26e1148c79bb7b061a6044572898936997f183aa4d01ea75410fd09b6cc"} Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.472815 4895 scope.go:117] "RemoveContainer" containerID="e6538945b8467358fd3380b527f24b0ab1c04855501067af81d37d5d3307e2d9" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.472981 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.512503 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.521142 4895 scope.go:117] "RemoveContainer" containerID="4d9f056665c4ff9f7d17d72e292ac1aceb8ff35ffb407868fce50f685b4a121a" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.543260 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.560164 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:31 crc kubenswrapper[4895]: E0320 13:43:31.560901 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-notification-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.560933 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-notification-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: E0320 13:43:31.560942 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="sg-core" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.560949 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="sg-core" Mar 20 13:43:31 crc kubenswrapper[4895]: E0320 13:43:31.560961 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="proxy-httpd" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.560968 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="proxy-httpd" Mar 20 13:43:31 crc kubenswrapper[4895]: E0320 13:43:31.560983 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-central-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.560989 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-central-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.561258 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="proxy-httpd" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.561274 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-central-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.561284 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="ceilometer-notification-agent" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.561303 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" containerName="sg-core" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.563190 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.565054 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.571240 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.571446 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.617038 4895 scope.go:117] "RemoveContainer" containerID="e850a33656701d162db75c000ab3002f1a862e55b1380a1b1e43ee7ce3ca1160" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.684281 4895 scope.go:117] "RemoveContainer" containerID="8a4d0881e7dc2a317e857493d749429190ac8532cedabaf742fb8816324549f5" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695414 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695449 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695487 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " 
pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695528 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695552 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn7dr\" (UniqueName: \"kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.695622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.804425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.804739 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.804918 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.805776 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.806262 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.806497 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.806867 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn7dr\" (UniqueName: \"kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" 
Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.807087 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.807866 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.812740 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.814228 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.818257 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.835862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.841582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn7dr\" (UniqueName: \"kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr\") pod \"ceilometer-0\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " pod="openstack/ceilometer-0" Mar 20 13:43:31 crc kubenswrapper[4895]: I0320 13:43:31.900913 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:32 crc kubenswrapper[4895]: I0320 13:43:32.425246 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:32 crc kubenswrapper[4895]: W0320 13:43:32.436073 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1473bd0d_0acb_4710_b3d9_02dfb027cb02.slice/crio-9d44268f13fbf2c5eb8a0ac2fd89123f9009894da8dab0801c09b00e34e6b403 WatchSource:0}: Error finding container 9d44268f13fbf2c5eb8a0ac2fd89123f9009894da8dab0801c09b00e34e6b403: Status 404 returned error can't find the container with id 9d44268f13fbf2c5eb8a0ac2fd89123f9009894da8dab0801c09b00e34e6b403 Mar 20 13:43:32 crc kubenswrapper[4895]: I0320 13:43:32.506375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d364d4-5809-40d8-8e14-11ae873d4c47","Type":"ContainerStarted","Data":"b69acc4d86b343edccf19e3249a9dd5ba52a47fa22b93648304dd1ca3e5fed57"} Mar 20 13:43:32 crc kubenswrapper[4895]: I0320 13:43:32.507781 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerStarted","Data":"9d44268f13fbf2c5eb8a0ac2fd89123f9009894da8dab0801c09b00e34e6b403"} Mar 20 13:43:32 crc kubenswrapper[4895]: I0320 13:43:32.532191 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.532170671 podStartE2EDuration="4.532170671s" podCreationTimestamp="2026-03-20 13:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:32.525934498 +0000 UTC m=+1312.035653464" watchObservedRunningTime="2026-03-20 13:43:32.532170671 +0000 UTC m=+1312.041889637" Mar 20 13:43:33 crc kubenswrapper[4895]: I0320 13:43:33.224833 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45105c76-5ee7-447e-a4c8-22cb30dbc6ff" path="/var/lib/kubelet/pods/45105c76-5ee7-447e-a4c8-22cb30dbc6ff/volumes" Mar 20 13:43:33 crc kubenswrapper[4895]: I0320 13:43:33.225842 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:33 crc kubenswrapper[4895]: I0320 13:43:33.519913 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerStarted","Data":"5cf1c980135b375f88ec3cec244efda61a807f5ed19505fb95cf369091a39597"} Mar 20 13:43:34 crc kubenswrapper[4895]: I0320 13:43:34.560447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerStarted","Data":"a22aa6fc63bdcda8490ada87ea56f39b03e677fa6e0286ae28562dd62f336acb"} Mar 20 13:43:34 crc kubenswrapper[4895]: I0320 13:43:34.561037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerStarted","Data":"cc83c162f503629e2610976defc0623b3f0d387ebad7312aaae57ed0f2f93204"} Mar 20 13:43:35 crc kubenswrapper[4895]: I0320 13:43:35.865969 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:43:35 
crc kubenswrapper[4895]: I0320 13:43:35.866261 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:43:35 crc kubenswrapper[4895]: I0320 13:43:35.912413 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:43:35 crc kubenswrapper[4895]: I0320 13:43:35.940837 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:43:36 crc kubenswrapper[4895]: I0320 13:43:36.582867 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:43:36 crc kubenswrapper[4895]: I0320 13:43:36.583188 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.408627 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.409234 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.446631 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.459803 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.619141 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:39 crc kubenswrapper[4895]: I0320 13:43:39.619217 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:40 
crc kubenswrapper[4895]: I0320 13:43:40.924171 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:43:40 crc kubenswrapper[4895]: I0320 13:43:40.924531 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:43:40 crc kubenswrapper[4895]: I0320 13:43:40.929686 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:43:42 crc kubenswrapper[4895]: I0320 13:43:42.399340 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:42 crc kubenswrapper[4895]: I0320 13:43:42.399848 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:43:42 crc kubenswrapper[4895]: I0320 13:43:42.405653 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.678683 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerStarted","Data":"a9e0ff85916fe872427ea295dfd33c299620917cdbe2afe1b5a5d440e6c3f6b6"} Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.679310 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.679080 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-notification-agent" containerID="cri-o://cc83c162f503629e2610976defc0623b3f0d387ebad7312aaae57ed0f2f93204" gracePeriod=30 Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.678840 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-central-agent" containerID="cri-o://5cf1c980135b375f88ec3cec244efda61a807f5ed19505fb95cf369091a39597" gracePeriod=30 Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.679335 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="sg-core" containerID="cri-o://a22aa6fc63bdcda8490ada87ea56f39b03e677fa6e0286ae28562dd62f336acb" gracePeriod=30 Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.679113 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="proxy-httpd" containerID="cri-o://a9e0ff85916fe872427ea295dfd33c299620917cdbe2afe1b5a5d440e6c3f6b6" gracePeriod=30 Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.684361 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-68krz" event={"ID":"2b0f7494-037c-462e-bd52-4a4d2469c62d","Type":"ContainerStarted","Data":"66ba7de2466ae09e820baab9478a0a185a4d58d159afdaa143a532cdb5c982b5"} Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.711123 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7078095429999998 podStartE2EDuration="14.71110173s" podCreationTimestamp="2026-03-20 13:43:31 +0000 UTC" firstStartedPulling="2026-03-20 13:43:32.440968325 +0000 UTC m=+1311.950687291" lastFinishedPulling="2026-03-20 13:43:44.444260502 +0000 UTC m=+1323.953979478" observedRunningTime="2026-03-20 13:43:45.702569449 +0000 UTC m=+1325.212288435" watchObservedRunningTime="2026-03-20 13:43:45.71110173 +0000 UTC m=+1325.220820696" Mar 20 13:43:45 crc kubenswrapper[4895]: I0320 13:43:45.725720 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-68krz" 
podStartSLOduration=3.266993061 podStartE2EDuration="18.725701579s" podCreationTimestamp="2026-03-20 13:43:27 +0000 UTC" firstStartedPulling="2026-03-20 13:43:28.985191256 +0000 UTC m=+1308.494910222" lastFinishedPulling="2026-03-20 13:43:44.443899774 +0000 UTC m=+1323.953618740" observedRunningTime="2026-03-20 13:43:45.716276417 +0000 UTC m=+1325.225995383" watchObservedRunningTime="2026-03-20 13:43:45.725701579 +0000 UTC m=+1325.235420545" Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.710997 4895 generic.go:334] "Generic (PLEG): container finished" podID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerID="a9e0ff85916fe872427ea295dfd33c299620917cdbe2afe1b5a5d440e6c3f6b6" exitCode=0 Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711233 4895 generic.go:334] "Generic (PLEG): container finished" podID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerID="a22aa6fc63bdcda8490ada87ea56f39b03e677fa6e0286ae28562dd62f336acb" exitCode=2 Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711244 4895 generic.go:334] "Generic (PLEG): container finished" podID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerID="cc83c162f503629e2610976defc0623b3f0d387ebad7312aaae57ed0f2f93204" exitCode=0 Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711250 4895 generic.go:334] "Generic (PLEG): container finished" podID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerID="5cf1c980135b375f88ec3cec244efda61a807f5ed19505fb95cf369091a39597" exitCode=0 Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711365 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerDied","Data":"a9e0ff85916fe872427ea295dfd33c299620917cdbe2afe1b5a5d440e6c3f6b6"} Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711463 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerDied","Data":"a22aa6fc63bdcda8490ada87ea56f39b03e677fa6e0286ae28562dd62f336acb"} Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711487 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerDied","Data":"cc83c162f503629e2610976defc0623b3f0d387ebad7312aaae57ed0f2f93204"} Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.711500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerDied","Data":"5cf1c980135b375f88ec3cec244efda61a807f5ed19505fb95cf369091a39597"} Mar 20 13:43:46 crc kubenswrapper[4895]: I0320 13:43:46.980848 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164125 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164194 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn7dr\" (UniqueName: \"kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc 
kubenswrapper[4895]: I0320 13:43:47.164323 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164440 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164461 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164536 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164910 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.164984 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.165054 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.169619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr" (OuterVolumeSpecName: "kube-api-access-sn7dr") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "kube-api-access-sn7dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.172190 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts" (OuterVolumeSpecName: "scripts") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.195441 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.250124 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.265498 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data" (OuterVolumeSpecName: "config-data") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.266687 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") pod \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\" (UID: \"1473bd0d-0acb-4710-b3d9-02dfb027cb02\") " Mar 20 13:43:47 crc kubenswrapper[4895]: W0320 13:43:47.266860 4895 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1473bd0d-0acb-4710-b3d9-02dfb027cb02/volumes/kubernetes.io~secret/config-data Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.266893 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data" (OuterVolumeSpecName: "config-data") pod "1473bd0d-0acb-4710-b3d9-02dfb027cb02" (UID: "1473bd0d-0acb-4710-b3d9-02dfb027cb02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.267927 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.268039 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.268123 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1473bd0d-0acb-4710-b3d9-02dfb027cb02-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.268211 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.268289 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1473bd0d-0acb-4710-b3d9-02dfb027cb02-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.268363 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn7dr\" (UniqueName: \"kubernetes.io/projected/1473bd0d-0acb-4710-b3d9-02dfb027cb02-kube-api-access-sn7dr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.723380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1473bd0d-0acb-4710-b3d9-02dfb027cb02","Type":"ContainerDied","Data":"9d44268f13fbf2c5eb8a0ac2fd89123f9009894da8dab0801c09b00e34e6b403"} Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 
13:43:47.723445 4895 scope.go:117] "RemoveContainer" containerID="a9e0ff85916fe872427ea295dfd33c299620917cdbe2afe1b5a5d440e6c3f6b6" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.723495 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.756404 4895 scope.go:117] "RemoveContainer" containerID="a22aa6fc63bdcda8490ada87ea56f39b03e677fa6e0286ae28562dd62f336acb" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.762064 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.775117 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.791118 4895 scope.go:117] "RemoveContainer" containerID="cc83c162f503629e2610976defc0623b3f0d387ebad7312aaae57ed0f2f93204" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.802320 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:47 crc kubenswrapper[4895]: E0320 13:43:47.802939 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-central-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.802962 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-central-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: E0320 13:43:47.802981 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="proxy-httpd" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.802989 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="proxy-httpd" Mar 20 13:43:47 crc kubenswrapper[4895]: E0320 13:43:47.803006 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="sg-core" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803014 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="sg-core" Mar 20 13:43:47 crc kubenswrapper[4895]: E0320 13:43:47.803030 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-notification-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803036 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-notification-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803276 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="sg-core" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803305 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="proxy-httpd" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803329 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-notification-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.803343 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" containerName="ceilometer-central-agent" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.806635 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.809017 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.811522 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.836951 4895 scope.go:117] "RemoveContainer" containerID="5cf1c980135b375f88ec3cec244efda61a807f5ed19505fb95cf369091a39597" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.857182 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.983200 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.983471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.983821 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.983952 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxsx\" 
(UniqueName: \"kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.984012 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.984089 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:47 crc kubenswrapper[4895]: I0320 13:43:47.984173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086015 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086082 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxsx\" (UniqueName: \"kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx\") pod \"ceilometer-0\" (UID: 
\"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086106 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086160 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086204 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086719 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.086751 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.092111 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.093110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.096925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.103654 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxsx\" (UniqueName: \"kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.114229 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.127994 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:43:48 crc kubenswrapper[4895]: W0320 13:43:48.610702 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd370eb51_2043_498f_b94a_11ac6f56f65f.slice/crio-9597a2c35a2d6fbe65c036a877e2bad0ddacbaa5b56726f143b2b86539b01b2b WatchSource:0}: Error finding container 9597a2c35a2d6fbe65c036a877e2bad0ddacbaa5b56726f143b2b86539b01b2b: Status 404 returned error can't find the container with id 9597a2c35a2d6fbe65c036a877e2bad0ddacbaa5b56726f143b2b86539b01b2b Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.622805 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:43:48 crc kubenswrapper[4895]: I0320 13:43:48.734001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerStarted","Data":"9597a2c35a2d6fbe65c036a877e2bad0ddacbaa5b56726f143b2b86539b01b2b"} Mar 20 13:43:49 crc kubenswrapper[4895]: I0320 13:43:49.249925 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1473bd0d-0acb-4710-b3d9-02dfb027cb02" path="/var/lib/kubelet/pods/1473bd0d-0acb-4710-b3d9-02dfb027cb02/volumes" Mar 20 13:43:49 crc kubenswrapper[4895]: I0320 13:43:49.745857 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerStarted","Data":"682f274553a20375f129c760c6bdd471b988822bc84ea29b332969845985d7b0"} Mar 20 13:43:50 crc 
kubenswrapper[4895]: I0320 13:43:50.755948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerStarted","Data":"2b74cbd2a13b39a194822da40aae23e8a89f8475d1cb598ba7c73a080ba9aae0"} Mar 20 13:43:51 crc kubenswrapper[4895]: I0320 13:43:51.766615 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerStarted","Data":"5ee9774ceccc6ac6611b1e14382e6f044426d251a6babd1fc6ce66e4fd3eca91"} Mar 20 13:43:52 crc kubenswrapper[4895]: I0320 13:43:52.778943 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerStarted","Data":"a83cc59874e33484146334881b4d4c84f236229de64fc23224b06cc64a03c8b9"} Mar 20 13:43:52 crc kubenswrapper[4895]: I0320 13:43:52.779399 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:43:52 crc kubenswrapper[4895]: I0320 13:43:52.801314 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.042852164 podStartE2EDuration="5.801294009s" podCreationTimestamp="2026-03-20 13:43:47 +0000 UTC" firstStartedPulling="2026-03-20 13:43:48.61281654 +0000 UTC m=+1328.122535506" lastFinishedPulling="2026-03-20 13:43:52.371258385 +0000 UTC m=+1331.880977351" observedRunningTime="2026-03-20 13:43:52.796050999 +0000 UTC m=+1332.305769975" watchObservedRunningTime="2026-03-20 13:43:52.801294009 +0000 UTC m=+1332.311012975" Mar 20 13:43:57 crc kubenswrapper[4895]: I0320 13:43:57.826629 4895 generic.go:334] "Generic (PLEG): container finished" podID="2b0f7494-037c-462e-bd52-4a4d2469c62d" containerID="66ba7de2466ae09e820baab9478a0a185a4d58d159afdaa143a532cdb5c982b5" exitCode=0 Mar 20 13:43:57 crc kubenswrapper[4895]: I0320 13:43:57.826703 4895 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-68krz" event={"ID":"2b0f7494-037c-462e-bd52-4a4d2469c62d","Type":"ContainerDied","Data":"66ba7de2466ae09e820baab9478a0a185a4d58d159afdaa143a532cdb5c982b5"} Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.511754 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.608840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhqsg\" (UniqueName: \"kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg\") pod \"2b0f7494-037c-462e-bd52-4a4d2469c62d\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.608943 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data\") pod \"2b0f7494-037c-462e-bd52-4a4d2469c62d\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.609129 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle\") pod \"2b0f7494-037c-462e-bd52-4a4d2469c62d\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.609162 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts\") pod \"2b0f7494-037c-462e-bd52-4a4d2469c62d\" (UID: \"2b0f7494-037c-462e-bd52-4a4d2469c62d\") " Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.617169 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts" (OuterVolumeSpecName: "scripts") pod "2b0f7494-037c-462e-bd52-4a4d2469c62d" (UID: "2b0f7494-037c-462e-bd52-4a4d2469c62d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.622486 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg" (OuterVolumeSpecName: "kube-api-access-qhqsg") pod "2b0f7494-037c-462e-bd52-4a4d2469c62d" (UID: "2b0f7494-037c-462e-bd52-4a4d2469c62d"). InnerVolumeSpecName "kube-api-access-qhqsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.642408 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b0f7494-037c-462e-bd52-4a4d2469c62d" (UID: "2b0f7494-037c-462e-bd52-4a4d2469c62d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.650540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data" (OuterVolumeSpecName: "config-data") pod "2b0f7494-037c-462e-bd52-4a4d2469c62d" (UID: "2b0f7494-037c-462e-bd52-4a4d2469c62d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.711783 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhqsg\" (UniqueName: \"kubernetes.io/projected/2b0f7494-037c-462e-bd52-4a4d2469c62d-kube-api-access-qhqsg\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.711824 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.711834 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.711842 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b0f7494-037c-462e-bd52-4a4d2469c62d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.846649 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-68krz" event={"ID":"2b0f7494-037c-462e-bd52-4a4d2469c62d","Type":"ContainerDied","Data":"decf57fbe71c454685cba312a919807bd43b02c98639638e7ec8030cbb2793a2"} Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.846700 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="decf57fbe71c454685cba312a919807bd43b02c98639638e7ec8030cbb2793a2" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.846704 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-68krz" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.988454 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:43:59 crc kubenswrapper[4895]: E0320 13:43:59.989128 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b0f7494-037c-462e-bd52-4a4d2469c62d" containerName="nova-cell0-conductor-db-sync" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.989144 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b0f7494-037c-462e-bd52-4a4d2469c62d" containerName="nova-cell0-conductor-db-sync" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.989331 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b0f7494-037c-462e-bd52-4a4d2469c62d" containerName="nova-cell0-conductor-db-sync" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.989981 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.993423 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:43:59 crc kubenswrapper[4895]: I0320 13:43:59.993525 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2hndf" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.000059 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.119043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 
13:44:00.119141 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmr2\" (UniqueName: \"kubernetes.io/projected/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-kube-api-access-bgmr2\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.119183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.138286 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-6m8fs"] Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.140002 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.143289 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.143799 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.144190 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.158204 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-6m8fs"] Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.221166 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4x6n\" (UniqueName: \"kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n\") pod \"auto-csr-approver-29566904-6m8fs\" (UID: \"998b603e-c3b3-48a4-8c84-db84434afa48\") " pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.221328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.221412 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmr2\" (UniqueName: \"kubernetes.io/projected/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-kube-api-access-bgmr2\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc 
kubenswrapper[4895]: I0320 13:44:00.221448 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.230695 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.230768 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.239019 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmr2\" (UniqueName: \"kubernetes.io/projected/4b4528c7-f8b8-4f3c-b86b-a803fee7d982-kube-api-access-bgmr2\") pod \"nova-cell0-conductor-0\" (UID: \"4b4528c7-f8b8-4f3c-b86b-a803fee7d982\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.322785 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4x6n\" (UniqueName: \"kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n\") pod \"auto-csr-approver-29566904-6m8fs\" (UID: \"998b603e-c3b3-48a4-8c84-db84434afa48\") " pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.342114 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x4x6n\" (UniqueName: \"kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n\") pod \"auto-csr-approver-29566904-6m8fs\" (UID: \"998b603e-c3b3-48a4-8c84-db84434afa48\") " pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.381977 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.465702 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:00 crc kubenswrapper[4895]: W0320 13:44:00.921712 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b4528c7_f8b8_4f3c_b86b_a803fee7d982.slice/crio-c33481b76d2a24a27f34eec5634e40ba78340ab43edc74aff14088e76650c101 WatchSource:0}: Error finding container c33481b76d2a24a27f34eec5634e40ba78340ab43edc74aff14088e76650c101: Status 404 returned error can't find the container with id c33481b76d2a24a27f34eec5634e40ba78340ab43edc74aff14088e76650c101 Mar 20 13:44:00 crc kubenswrapper[4895]: I0320 13:44:00.928010 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.127602 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-6m8fs"] Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.865144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" event={"ID":"998b603e-c3b3-48a4-8c84-db84434afa48","Type":"ContainerStarted","Data":"2c60c1378c390488b25646a357fea8539ac7422c81e1d009d5eb70540cf9f8b2"} Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.866779 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b4528c7-f8b8-4f3c-b86b-a803fee7d982","Type":"ContainerStarted","Data":"d54e2c231157cc23a572442ec313b7a17521b81a460eba4c3a1622324612c1dc"} Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.866820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b4528c7-f8b8-4f3c-b86b-a803fee7d982","Type":"ContainerStarted","Data":"c33481b76d2a24a27f34eec5634e40ba78340ab43edc74aff14088e76650c101"} Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.866930 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:01 crc kubenswrapper[4895]: I0320 13:44:01.883499 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.883466709 podStartE2EDuration="2.883466709s" podCreationTimestamp="2026-03-20 13:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:01.879159822 +0000 UTC m=+1341.388878788" watchObservedRunningTime="2026-03-20 13:44:01.883466709 +0000 UTC m=+1341.393185675" Mar 20 13:44:02 crc kubenswrapper[4895]: I0320 13:44:02.877626 4895 generic.go:334] "Generic (PLEG): container finished" podID="998b603e-c3b3-48a4-8c84-db84434afa48" containerID="dc60ae3f9fae0b7e021db4f4cddb0b76499aebb399d4feabb075ca1db475c83c" exitCode=0 Mar 20 13:44:02 crc kubenswrapper[4895]: I0320 13:44:02.877730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" event={"ID":"998b603e-c3b3-48a4-8c84-db84434afa48","Type":"ContainerDied","Data":"dc60ae3f9fae0b7e021db4f4cddb0b76499aebb399d4feabb075ca1db475c83c"} Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.564352 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.721571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4x6n\" (UniqueName: \"kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n\") pod \"998b603e-c3b3-48a4-8c84-db84434afa48\" (UID: \"998b603e-c3b3-48a4-8c84-db84434afa48\") " Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.726968 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n" (OuterVolumeSpecName: "kube-api-access-x4x6n") pod "998b603e-c3b3-48a4-8c84-db84434afa48" (UID: "998b603e-c3b3-48a4-8c84-db84434afa48"). InnerVolumeSpecName "kube-api-access-x4x6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.825131 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4x6n\" (UniqueName: \"kubernetes.io/projected/998b603e-c3b3-48a4-8c84-db84434afa48-kube-api-access-x4x6n\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.898326 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" event={"ID":"998b603e-c3b3-48a4-8c84-db84434afa48","Type":"ContainerDied","Data":"2c60c1378c390488b25646a357fea8539ac7422c81e1d009d5eb70540cf9f8b2"} Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.898369 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-6m8fs" Mar 20 13:44:04 crc kubenswrapper[4895]: I0320 13:44:04.898375 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c60c1378c390488b25646a357fea8539ac7422c81e1d009d5eb70540cf9f8b2" Mar 20 13:44:05 crc kubenswrapper[4895]: I0320 13:44:05.641451 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-smz2g"] Mar 20 13:44:05 crc kubenswrapper[4895]: I0320 13:44:05.653826 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-smz2g"] Mar 20 13:44:07 crc kubenswrapper[4895]: I0320 13:44:07.226043 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d152aade-646d-44e1-b484-b59672468f56" path="/var/lib/kubelet/pods/d152aade-646d-44e1-b484-b59672468f56/volumes" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.413019 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.933788 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hnr4b"] Mar 20 13:44:10 crc kubenswrapper[4895]: E0320 13:44:10.934283 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998b603e-c3b3-48a4-8c84-db84434afa48" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.934302 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="998b603e-c3b3-48a4-8c84-db84434afa48" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.934563 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="998b603e-c3b3-48a4-8c84-db84434afa48" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.935443 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.937219 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.937979 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 13:44:10 crc kubenswrapper[4895]: I0320 13:44:10.949507 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hnr4b"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.044433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtf5\" (UniqueName: \"kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.044487 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.044763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.044851 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.141332 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.143074 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.146377 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.147076 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtf5\" (UniqueName: \"kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.147173 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.147327 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.147378 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.157556 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.162421 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.172242 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.176043 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.196046 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtf5\" (UniqueName: \"kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5\") pod \"nova-cell0-cell-mapping-hnr4b\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc 
kubenswrapper[4895]: I0320 13:44:11.258825 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.261620 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzfl6\" (UniqueName: \"kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.261809 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.261936 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.290263 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.302009 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.311830 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.324457 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.326974 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.337743 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.362015 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364341 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364504 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdjx5\" (UniqueName: \"kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364545 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " 
pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364613 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzfl6\" (UniqueName: \"kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364639 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.364681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.395178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.396895 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.424156 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.439025 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzfl6\" (UniqueName: \"kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6\") pod \"nova-scheduler-0\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdjx5\" (UniqueName: \"kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466677 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466722 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55d78\" (UniqueName: \"kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466741 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466776 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466887 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.466907 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.470667 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" 
Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.470958 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.473989 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.509508 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdjx5\" (UniqueName: \"kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5\") pod \"nova-metadata-0\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.550086 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.552094 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.563743 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.564983 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.566437 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.573550 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.573586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.573681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55d78\" (UniqueName: \"kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.573749 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.578656 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.579036 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.579486 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.585570 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.591435 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.598374 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.619367 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55d78\" (UniqueName: \"kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78\") pod \"nova-api-0\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675708 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675823 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675842 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdlx8\" (UniqueName: \"kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qsgd\" (UniqueName: \"kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675898 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675923 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: 
I0320 13:44:11.675946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.675985 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.676009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779514 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779627 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779645 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdlx8\" (UniqueName: \"kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779670 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qsgd\" (UniqueName: \"kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779702 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779731 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779753 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779791 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.779811 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.780728 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.784059 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.784687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.785010 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.785253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.788194 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.792943 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.794909 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.797110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.812649 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qsgd\" (UniqueName: \"kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd\") pod \"dnsmasq-dns-884c8b8f5-jkq4x\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.820847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdlx8\" (UniqueName: \"kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:11 crc kubenswrapper[4895]: I0320 13:44:11.899883 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.007871 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.045168 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hnr4b"] Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.307660 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.622462 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.717596 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6sw4l"] Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.719013 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.721188 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.721365 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.730735 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6sw4l"] Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.918049 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.918100 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.918147 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:12 crc kubenswrapper[4895]: I0320 13:44:12.918209 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4j2\" (UniqueName: \"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.019514 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.019845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.020535 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.020623 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4j2\" (UniqueName: \"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.027460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.033081 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.035191 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.042262 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pd4j2\" (UniqueName: \"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2\") pod \"nova-cell1-conductor-db-sync-6sw4l\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.082851 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hnr4b" event={"ID":"92cb01ad-b24f-4840-b7d8-6118730ac633","Type":"ContainerStarted","Data":"b85da950b26165d40e7a5917881f1c1c1993e9abc3472d57087bb6351ded117d"} Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.083194 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hnr4b" event={"ID":"92cb01ad-b24f-4840-b7d8-6118730ac633","Type":"ContainerStarted","Data":"24d72c8949b71d5af60b856a09bfa1dc38c0c105e62cba0bd7527a3185a5b27b"} Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.085958 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerStarted","Data":"a8d6b2310a267f59a7353242934889d108295e103dddd647345ed33aa734058c"} Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.090616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2bbda25c-946b-497c-afef-622259be6557","Type":"ContainerStarted","Data":"a3287f3a0e81ccb7e84aa8a4e801a0e4e4235d39b31893bbfe83494f33cd90ba"} Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.305029 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hnr4b" podStartSLOduration=3.305009966 podStartE2EDuration="3.305009966s" podCreationTimestamp="2026-03-20 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:13.103579653 +0000 UTC m=+1352.613298619" 
watchObservedRunningTime="2026-03-20 13:44:13.305009966 +0000 UTC m=+1352.814728932" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.306853 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.323303 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.341536 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.346070 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:13 crc kubenswrapper[4895]: I0320 13:44:13.516729 4895 scope.go:117] "RemoveContainer" containerID="5b77b0afde30292649e2437e895cf828f3d3d843a6e9b9834667d43b48e1903b" Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.125353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"438b09cd-fd26-4ad4-a095-a63130a8e2f7","Type":"ContainerStarted","Data":"a86cef7a7d9d7b0fd80c5f78cf31054a576c565bbdd54072cf6545629347b760"} Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.141503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerStarted","Data":"239e26060116c9d8d0047c2f4dfe1e12633f4eb73012a1dbb8ebc062e36d14ff"} Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.171919 4895 generic.go:334] "Generic (PLEG): container finished" podID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerID="b310e722bb88098ec3f93098c414ebe861a79c4841e205a7f00791b9b36ed8fb" exitCode=0 Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.172840 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" 
event={"ID":"fbf08ec8-de92-4326-9567-c6fe64dfa07e","Type":"ContainerDied","Data":"b310e722bb88098ec3f93098c414ebe861a79c4841e205a7f00791b9b36ed8fb"} Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.172880 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" event={"ID":"fbf08ec8-de92-4326-9567-c6fe64dfa07e","Type":"ContainerStarted","Data":"9156313be49edf7d56594beb875f478b5514d2242481276d5354df571eb3f03f"} Mar 20 13:44:14 crc kubenswrapper[4895]: I0320 13:44:14.360317 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6sw4l"] Mar 20 13:44:15 crc kubenswrapper[4895]: I0320 13:44:15.187852 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" event={"ID":"23e0421a-d787-4327-b7bf-b4f974871690","Type":"ContainerStarted","Data":"2eb0b9c504fb5df63dba7cd715bc4be968cb4151c188362852f6f72daed58d2b"} Mar 20 13:44:15 crc kubenswrapper[4895]: I0320 13:44:15.473027 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:15 crc kubenswrapper[4895]: I0320 13:44:15.489185 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:16 crc kubenswrapper[4895]: I0320 13:44:16.204547 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" event={"ID":"fbf08ec8-de92-4326-9567-c6fe64dfa07e","Type":"ContainerStarted","Data":"7140b863324081883ed3fb6bbcbc4576d67787181aeca296da749830fde15fc9"} Mar 20 13:44:16 crc kubenswrapper[4895]: I0320 13:44:16.204864 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:16 crc kubenswrapper[4895]: I0320 13:44:16.228318 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" podStartSLOduration=5.228299478 
podStartE2EDuration="5.228299478s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:16.222117995 +0000 UTC m=+1355.731836971" watchObservedRunningTime="2026-03-20 13:44:16.228299478 +0000 UTC m=+1355.738018444" Mar 20 13:44:17 crc kubenswrapper[4895]: I0320 13:44:17.237616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" event={"ID":"23e0421a-d787-4327-b7bf-b4f974871690","Type":"ContainerStarted","Data":"c09da54244a40e429af0155a9927e13121445bcdc6fc0a3d2259bb590d5d6e30"} Mar 20 13:44:17 crc kubenswrapper[4895]: I0320 13:44:17.261272 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" podStartSLOduration=5.261252754 podStartE2EDuration="5.261252754s" podCreationTimestamp="2026-03-20 13:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:17.260996226 +0000 UTC m=+1356.770715192" watchObservedRunningTime="2026-03-20 13:44:17.261252754 +0000 UTC m=+1356.770971720" Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.169422 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.276854 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2bbda25c-946b-497c-afef-622259be6557","Type":"ContainerStarted","Data":"b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01"} Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.289651 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerStarted","Data":"57b2e886e9c279023fa9125c83c7df6541549fd46b46b0aad45694c4366391b8"} Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.292114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerStarted","Data":"ce56361e12c569a23e84a6e04c0dd4dc58a0bc0515af9f7f451bc94e772607ba"} Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.307599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"438b09cd-fd26-4ad4-a095-a63130a8e2f7","Type":"ContainerStarted","Data":"8aa9601c57067c257f81e2d5b1e61583cfebd23ed5f2326fdb664055f7308c10"} Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.309774 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8aa9601c57067c257f81e2d5b1e61583cfebd23ed5f2326fdb664055f7308c10" gracePeriod=30 Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.328713 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.328765581 podStartE2EDuration="7.328689678s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="2026-03-20 13:44:12.33335629 +0000 UTC m=+1351.843075256" lastFinishedPulling="2026-03-20 13:44:17.333280387 +0000 UTC m=+1356.842999353" observedRunningTime="2026-03-20 13:44:18.302269607 +0000 UTC m=+1357.811988573" watchObservedRunningTime="2026-03-20 13:44:18.328689678 +0000 UTC m=+1357.838408644" Mar 20 13:44:18 crc kubenswrapper[4895]: I0320 13:44:18.351327 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.3493300599999998 podStartE2EDuration="7.351308115s" 
podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="2026-03-20 13:44:13.330967204 +0000 UTC m=+1352.840686170" lastFinishedPulling="2026-03-20 13:44:17.332945259 +0000 UTC m=+1356.842664225" observedRunningTime="2026-03-20 13:44:18.343875013 +0000 UTC m=+1357.853593979" watchObservedRunningTime="2026-03-20 13:44:18.351308115 +0000 UTC m=+1357.861027081" Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.318096 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerStarted","Data":"f034a7c6eb7e642f60c29171ee52248b966d1d72e1280e9ab470217688384da6"} Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.320852 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerStarted","Data":"dde1795b5e49f890b5ac5235c4e010476fc3d4d0cb274397a9870e258e34ef45"} Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.321099 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-log" containerID="cri-o://57b2e886e9c279023fa9125c83c7df6541549fd46b46b0aad45694c4366391b8" gracePeriod=30 Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.321160 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-metadata" containerID="cri-o://dde1795b5e49f890b5ac5235c4e010476fc3d4d0cb274397a9870e258e34ef45" gracePeriod=30 Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.349878 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.656204371 podStartE2EDuration="8.349853143s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" 
firstStartedPulling="2026-03-20 13:44:12.639633315 +0000 UTC m=+1352.149352281" lastFinishedPulling="2026-03-20 13:44:17.333282087 +0000 UTC m=+1356.843001053" observedRunningTime="2026-03-20 13:44:19.347817153 +0000 UTC m=+1358.857536119" watchObservedRunningTime="2026-03-20 13:44:19.349853143 +0000 UTC m=+1358.859572109" Mar 20 13:44:19 crc kubenswrapper[4895]: I0320 13:44:19.401818 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.400542286 podStartE2EDuration="8.401795943s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="2026-03-20 13:44:13.335601179 +0000 UTC m=+1352.845320145" lastFinishedPulling="2026-03-20 13:44:17.336854836 +0000 UTC m=+1356.846573802" observedRunningTime="2026-03-20 13:44:19.400712507 +0000 UTC m=+1358.910431473" watchObservedRunningTime="2026-03-20 13:44:19.401795943 +0000 UTC m=+1358.911514909" Mar 20 13:44:20 crc kubenswrapper[4895]: I0320 13:44:20.331439 4895 generic.go:334] "Generic (PLEG): container finished" podID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerID="dde1795b5e49f890b5ac5235c4e010476fc3d4d0cb274397a9870e258e34ef45" exitCode=0 Mar 20 13:44:20 crc kubenswrapper[4895]: I0320 13:44:20.331744 4895 generic.go:334] "Generic (PLEG): container finished" podID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerID="57b2e886e9c279023fa9125c83c7df6541549fd46b46b0aad45694c4366391b8" exitCode=143 Mar 20 13:44:20 crc kubenswrapper[4895]: I0320 13:44:20.331502 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerDied","Data":"dde1795b5e49f890b5ac5235c4e010476fc3d4d0cb274397a9870e258e34ef45"} Mar 20 13:44:20 crc kubenswrapper[4895]: I0320 13:44:20.331810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerDied","Data":"57b2e886e9c279023fa9125c83c7df6541549fd46b46b0aad45694c4366391b8"} Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.343957 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35447321-8c31-4ab1-b9a7-5ff3013b4811","Type":"ContainerDied","Data":"239e26060116c9d8d0047c2f4dfe1e12633f4eb73012a1dbb8ebc062e36d14ff"} Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.344222 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239e26060116c9d8d0047c2f4dfe1e12633f4eb73012a1dbb8ebc062e36d14ff" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.356587 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.415961 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data\") pod \"35447321-8c31-4ab1-b9a7-5ff3013b4811\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.416224 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdjx5\" (UniqueName: \"kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5\") pod \"35447321-8c31-4ab1-b9a7-5ff3013b4811\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.416314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs\") pod \"35447321-8c31-4ab1-b9a7-5ff3013b4811\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.416405 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle\") pod \"35447321-8c31-4ab1-b9a7-5ff3013b4811\" (UID: \"35447321-8c31-4ab1-b9a7-5ff3013b4811\") " Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.416604 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs" (OuterVolumeSpecName: "logs") pod "35447321-8c31-4ab1-b9a7-5ff3013b4811" (UID: "35447321-8c31-4ab1-b9a7-5ff3013b4811"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.417026 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35447321-8c31-4ab1-b9a7-5ff3013b4811-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.426605 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5" (OuterVolumeSpecName: "kube-api-access-rdjx5") pod "35447321-8c31-4ab1-b9a7-5ff3013b4811" (UID: "35447321-8c31-4ab1-b9a7-5ff3013b4811"). InnerVolumeSpecName "kube-api-access-rdjx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.472502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data" (OuterVolumeSpecName: "config-data") pod "35447321-8c31-4ab1-b9a7-5ff3013b4811" (UID: "35447321-8c31-4ab1-b9a7-5ff3013b4811"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.518781 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.518816 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdjx5\" (UniqueName: \"kubernetes.io/projected/35447321-8c31-4ab1-b9a7-5ff3013b4811-kube-api-access-rdjx5\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.520524 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35447321-8c31-4ab1-b9a7-5ff3013b4811" (UID: "35447321-8c31-4ab1-b9a7-5ff3013b4811"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.579989 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.580043 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.615212 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.621353 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35447321-8c31-4ab1-b9a7-5ff3013b4811-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.797607 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 
13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.797664 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.902525 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.968540 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:44:21 crc kubenswrapper[4895]: I0320 13:44:21.968877 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="dnsmasq-dns" containerID="cri-o://d814b26ac164dcfecaa734ad8e39248ad6a76743820c119cac765e4d415f361e" gracePeriod=10 Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.008819 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.297074 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.297141 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.369555 4895 generic.go:334] "Generic (PLEG): container finished" podID="71bfd8c2-e6fb-408a-affd-75569329c598" 
containerID="d814b26ac164dcfecaa734ad8e39248ad6a76743820c119cac765e4d415f361e" exitCode=0 Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.369629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" event={"ID":"71bfd8c2-e6fb-408a-affd-75569329c598","Type":"ContainerDied","Data":"d814b26ac164dcfecaa734ad8e39248ad6a76743820c119cac765e4d415f361e"} Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.385074 4895 generic.go:334] "Generic (PLEG): container finished" podID="92cb01ad-b24f-4840-b7d8-6118730ac633" containerID="b85da950b26165d40e7a5917881f1c1c1993e9abc3472d57087bb6351ded117d" exitCode=0 Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.385176 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.385214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hnr4b" event={"ID":"92cb01ad-b24f-4840-b7d8-6118730ac633","Type":"ContainerDied","Data":"b85da950b26165d40e7a5917881f1c1c1993e9abc3472d57087bb6351ded117d"} Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.452633 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.478502 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.483145 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.541789 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:22 crc kubenswrapper[4895]: E0320 13:44:22.542276 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-log" Mar 20 13:44:22 crc 
kubenswrapper[4895]: I0320 13:44:22.542293 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-log" Mar 20 13:44:22 crc kubenswrapper[4895]: E0320 13:44:22.542330 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-metadata" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.542335 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-metadata" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.542560 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-log" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.542579 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" containerName="nova-metadata-metadata" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.543656 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.545765 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.549266 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.577668 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.657778 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.657823 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mz2\" (UniqueName: \"kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.657861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.657988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.658005 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.759491 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.759538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.759614 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.759631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mz2\" (UniqueName: \"kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 
13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.759668 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.760091 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.765934 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.766582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.773908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.781991 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mz2\" (UniqueName: 
\"kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2\") pod \"nova-metadata-0\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " pod="openstack/nova-metadata-0" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.882601 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.882961 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:22 crc kubenswrapper[4895]: I0320 13:44:22.913370 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.016795 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168121 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168241 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168268 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168297 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168381 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7c7\" (UniqueName: \"kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.168445 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc\") pod \"71bfd8c2-e6fb-408a-affd-75569329c598\" (UID: \"71bfd8c2-e6fb-408a-affd-75569329c598\") " Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.231654 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7" (OuterVolumeSpecName: "kube-api-access-jn7c7") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). InnerVolumeSpecName "kube-api-access-jn7c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.254960 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35447321-8c31-4ab1-b9a7-5ff3013b4811" path="/var/lib/kubelet/pods/35447321-8c31-4ab1-b9a7-5ff3013b4811/volumes" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.275736 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7c7\" (UniqueName: \"kubernetes.io/projected/71bfd8c2-e6fb-408a-affd-75569329c598-kube-api-access-jn7c7\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.306434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.308596 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.318339 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config" (OuterVolumeSpecName: "config") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.333302 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.378241 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.378268 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.378277 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.378285 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.387976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71bfd8c2-e6fb-408a-affd-75569329c598" (UID: "71bfd8c2-e6fb-408a-affd-75569329c598"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.418187 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.418483 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bd69657f-z9tgz" event={"ID":"71bfd8c2-e6fb-408a-affd-75569329c598","Type":"ContainerDied","Data":"28aefa9b85bbe92d279062b66c685753fca74745e82e5971a7c5161b28f53e49"} Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.418546 4895 scope.go:117] "RemoveContainer" containerID="d814b26ac164dcfecaa734ad8e39248ad6a76743820c119cac765e4d415f361e" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.479873 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71bfd8c2-e6fb-408a-affd-75569329c598-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.494823 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.507047 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bd69657f-z9tgz"] Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.508793 4895 scope.go:117] "RemoveContainer" containerID="1cc0da441010671eb68be61727574a7b05ee812152dedad32c8ad3962ee196aa" Mar 20 13:44:23 crc kubenswrapper[4895]: E0320 13:44:23.727702 4895 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71bfd8c2_e6fb_408a_affd_75569329c598.slice/crio-28aefa9b85bbe92d279062b66c685753fca74745e82e5971a7c5161b28f53e49\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71bfd8c2_e6fb_408a_affd_75569329c598.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:44:23 crc kubenswrapper[4895]: I0320 13:44:23.964685 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.423507 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.433700 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hnr4b" event={"ID":"92cb01ad-b24f-4840-b7d8-6118730ac633","Type":"ContainerDied","Data":"24d72c8949b71d5af60b856a09bfa1dc38c0c105e62cba0bd7527a3185a5b27b"} Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.433754 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d72c8949b71d5af60b856a09bfa1dc38c0c105e62cba0bd7527a3185a5b27b" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.433865 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hnr4b" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.436122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerStarted","Data":"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb"} Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.436176 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerStarted","Data":"87db15c59393ad44c9bf124c316f5f53b7aedd3a71d944900bfb035a742d290c"} Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.617091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle\") pod \"92cb01ad-b24f-4840-b7d8-6118730ac633\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.617267 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdtf5\" (UniqueName: \"kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5\") pod \"92cb01ad-b24f-4840-b7d8-6118730ac633\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.617300 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts\") pod \"92cb01ad-b24f-4840-b7d8-6118730ac633\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.617431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data\") pod 
\"92cb01ad-b24f-4840-b7d8-6118730ac633\" (UID: \"92cb01ad-b24f-4840-b7d8-6118730ac633\") " Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.623998 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts" (OuterVolumeSpecName: "scripts") pod "92cb01ad-b24f-4840-b7d8-6118730ac633" (UID: "92cb01ad-b24f-4840-b7d8-6118730ac633"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.634119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5" (OuterVolumeSpecName: "kube-api-access-mdtf5") pod "92cb01ad-b24f-4840-b7d8-6118730ac633" (UID: "92cb01ad-b24f-4840-b7d8-6118730ac633"). InnerVolumeSpecName "kube-api-access-mdtf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.664905 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92cb01ad-b24f-4840-b7d8-6118730ac633" (UID: "92cb01ad-b24f-4840-b7d8-6118730ac633"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.692583 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data" (OuterVolumeSpecName: "config-data") pod "92cb01ad-b24f-4840-b7d8-6118730ac633" (UID: "92cb01ad-b24f-4840-b7d8-6118730ac633"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.719443 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.719484 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.719495 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdtf5\" (UniqueName: \"kubernetes.io/projected/92cb01ad-b24f-4840-b7d8-6118730ac633-kube-api-access-mdtf5\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:24 crc kubenswrapper[4895]: I0320 13:44:24.719504 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92cb01ad-b24f-4840-b7d8-6118730ac633-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.224178 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" path="/var/lib/kubelet/pods/71bfd8c2-e6fb-408a-affd-75569329c598/volumes" Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.453448 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerStarted","Data":"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38"} Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.486733 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.486711728 podStartE2EDuration="3.486711728s" podCreationTimestamp="2026-03-20 13:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:25.472920918 +0000 UTC m=+1364.982639884" watchObservedRunningTime="2026-03-20 13:44:25.486711728 +0000 UTC m=+1364.996430694" Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.672970 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.673245 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-log" containerID="cri-o://ce56361e12c569a23e84a6e04c0dd4dc58a0bc0515af9f7f451bc94e772607ba" gracePeriod=30 Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.673305 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-api" containerID="cri-o://f034a7c6eb7e642f60c29171ee52248b966d1d72e1280e9ab470217688384da6" gracePeriod=30 Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.691006 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.691281 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2bbda25c-946b-497c-afef-622259be6557" containerName="nova-scheduler-scheduler" containerID="cri-o://b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" gracePeriod=30 Mar 20 13:44:25 crc kubenswrapper[4895]: I0320 13:44:25.705014 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.018155 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.018364 4895 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/kube-state-metrics-0" podUID="9f27bbad-8a84-4902-8349-8c8724552442" containerName="kube-state-metrics" containerID="cri-o://27c97374a7acc4bbb9412bcb712b4beb51a28c4c64a1d5a4de424a262ffdba2f" gracePeriod=30 Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.466734 4895 generic.go:334] "Generic (PLEG): container finished" podID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerID="ce56361e12c569a23e84a6e04c0dd4dc58a0bc0515af9f7f451bc94e772607ba" exitCode=143 Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.466802 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerDied","Data":"ce56361e12c569a23e84a6e04c0dd4dc58a0bc0515af9f7f451bc94e772607ba"} Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.468537 4895 generic.go:334] "Generic (PLEG): container finished" podID="9f27bbad-8a84-4902-8349-8c8724552442" containerID="27c97374a7acc4bbb9412bcb712b4beb51a28c4c64a1d5a4de424a262ffdba2f" exitCode=2 Mar 20 13:44:26 crc kubenswrapper[4895]: I0320 13:44:26.469491 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f27bbad-8a84-4902-8349-8c8724552442","Type":"ContainerDied","Data":"27c97374a7acc4bbb9412bcb712b4beb51a28c4c64a1d5a4de424a262ffdba2f"} Mar 20 13:44:26 crc kubenswrapper[4895]: E0320 13:44:26.580617 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01 is running failed: container process not found" containerID="b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:44:26 crc kubenswrapper[4895]: E0320 13:44:26.580965 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01 is running failed: container process not found" containerID="b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:44:26 crc kubenswrapper[4895]: E0320 13:44:26.581212 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01 is running failed: container process not found" containerID="b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:44:26 crc kubenswrapper[4895]: E0320 13:44:26.581242 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2bbda25c-946b-497c-afef-622259be6557" containerName="nova-scheduler-scheduler" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.491194 4895 generic.go:334] "Generic (PLEG): container finished" podID="2bbda25c-946b-497c-afef-622259be6557" containerID="b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" exitCode=0 Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.491479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2bbda25c-946b-497c-afef-622259be6557","Type":"ContainerDied","Data":"b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01"} Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.491506 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2bbda25c-946b-497c-afef-622259be6557","Type":"ContainerDied","Data":"a3287f3a0e81ccb7e84aa8a4e801a0e4e4235d39b31893bbfe83494f33cd90ba"} Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.491516 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3287f3a0e81ccb7e84aa8a4e801a0e4e4235d39b31893bbfe83494f33cd90ba" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.493634 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f27bbad-8a84-4902-8349-8c8724552442","Type":"ContainerDied","Data":"6bd465cd027abf03f28efae2a3be37dab0a7fdd76d5d0099d41fbfe8fb34736b"} Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.493686 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bd465cd027abf03f28efae2a3be37dab0a7fdd76d5d0099d41fbfe8fb34736b" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.495184 4895 generic.go:334] "Generic (PLEG): container finished" podID="23e0421a-d787-4327-b7bf-b4f974871690" containerID="c09da54244a40e429af0155a9927e13121445bcdc6fc0a3d2259bb590d5d6e30" exitCode=0 Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.495268 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" event={"ID":"23e0421a-d787-4327-b7bf-b4f974871690","Type":"ContainerDied","Data":"c09da54244a40e429af0155a9927e13121445bcdc6fc0a3d2259bb590d5d6e30"} Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.495363 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-log" containerID="cri-o://cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" gracePeriod=30 Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.495472 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-metadata" containerID="cri-o://bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" gracePeriod=30 Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.548732 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.556845 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.690086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzfl6\" (UniqueName: \"kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6\") pod \"2bbda25c-946b-497c-afef-622259be6557\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.690229 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle\") pod \"2bbda25c-946b-497c-afef-622259be6557\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.690341 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhb6v\" (UniqueName: \"kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v\") pod \"9f27bbad-8a84-4902-8349-8c8724552442\" (UID: \"9f27bbad-8a84-4902-8349-8c8724552442\") " Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.690657 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data\") pod \"2bbda25c-946b-497c-afef-622259be6557\" (UID: \"2bbda25c-946b-497c-afef-622259be6557\") " Mar 20 
13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.696133 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v" (OuterVolumeSpecName: "kube-api-access-qhb6v") pod "9f27bbad-8a84-4902-8349-8c8724552442" (UID: "9f27bbad-8a84-4902-8349-8c8724552442"). InnerVolumeSpecName "kube-api-access-qhb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.714605 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6" (OuterVolumeSpecName: "kube-api-access-nzfl6") pod "2bbda25c-946b-497c-afef-622259be6557" (UID: "2bbda25c-946b-497c-afef-622259be6557"). InnerVolumeSpecName "kube-api-access-nzfl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.740568 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data" (OuterVolumeSpecName: "config-data") pod "2bbda25c-946b-497c-afef-622259be6557" (UID: "2bbda25c-946b-497c-afef-622259be6557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.768527 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bbda25c-946b-497c-afef-622259be6557" (UID: "2bbda25c-946b-497c-afef-622259be6557"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.778433 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:44:27 crc kubenswrapper[4895]: E0320 13:44:27.778884 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="dnsmasq-dns" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.778900 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="dnsmasq-dns" Mar 20 13:44:27 crc kubenswrapper[4895]: E0320 13:44:27.778922 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f27bbad-8a84-4902-8349-8c8724552442" containerName="kube-state-metrics" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.778928 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f27bbad-8a84-4902-8349-8c8724552442" containerName="kube-state-metrics" Mar 20 13:44:27 crc kubenswrapper[4895]: E0320 13:44:27.778960 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92cb01ad-b24f-4840-b7d8-6118730ac633" containerName="nova-manage" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.778967 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="92cb01ad-b24f-4840-b7d8-6118730ac633" containerName="nova-manage" Mar 20 13:44:27 crc kubenswrapper[4895]: E0320 13:44:27.778976 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="init" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.778981 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="init" Mar 20 13:44:27 crc kubenswrapper[4895]: E0320 13:44:27.778998 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbda25c-946b-497c-afef-622259be6557" containerName="nova-scheduler-scheduler" Mar 20 
13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.779003 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbda25c-946b-497c-afef-622259be6557" containerName="nova-scheduler-scheduler" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.779198 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f27bbad-8a84-4902-8349-8c8724552442" containerName="kube-state-metrics" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.779215 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bfd8c2-e6fb-408a-affd-75569329c598" containerName="dnsmasq-dns" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.779229 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbda25c-946b-497c-afef-622259be6557" containerName="nova-scheduler-scheduler" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.779240 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="92cb01ad-b24f-4840-b7d8-6118730ac633" containerName="nova-manage" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.780645 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.796637 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzfl6\" (UniqueName: \"kubernetes.io/projected/2bbda25c-946b-497c-afef-622259be6557-kube-api-access-nzfl6\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.796676 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.796689 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhb6v\" (UniqueName: \"kubernetes.io/projected/9f27bbad-8a84-4902-8349-8c8724552442-kube-api-access-qhb6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.796705 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bbda25c-946b-497c-afef-622259be6557-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.821676 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.898165 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.898256 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.898286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2mx\" (UniqueName: \"kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:27 crc kubenswrapper[4895]: I0320 13:44:27.999574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:27.999886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:27.999919 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2mx\" (UniqueName: \"kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.000618 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.000846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.020282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2mx\" (UniqueName: \"kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx\") pod \"redhat-operators-226rz\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.152972 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.412788 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.515934 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4mz2\" (UniqueName: \"kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2\") pod \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.515998 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs\") pod \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.516134 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle\") pod \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.516198 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs\") pod \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.516305 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data\") pod \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\" (UID: \"326dc8dd-88ca-4dc0-8bd0-3f01cc636418\") " Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.517286 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs" (OuterVolumeSpecName: "logs") pod "326dc8dd-88ca-4dc0-8bd0-3f01cc636418" (UID: "326dc8dd-88ca-4dc0-8bd0-3f01cc636418"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.521533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2" (OuterVolumeSpecName: "kube-api-access-v4mz2") pod "326dc8dd-88ca-4dc0-8bd0-3f01cc636418" (UID: "326dc8dd-88ca-4dc0-8bd0-3f01cc636418"). InnerVolumeSpecName "kube-api-access-v4mz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.521799 4895 generic.go:334] "Generic (PLEG): container finished" podID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerID="bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" exitCode=0 Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.521833 4895 generic.go:334] "Generic (PLEG): container finished" podID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerID="cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" exitCode=143 Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.522050 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523068 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerDied","Data":"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38"} Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerDied","Data":"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb"} Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"326dc8dd-88ca-4dc0-8bd0-3f01cc636418","Type":"ContainerDied","Data":"87db15c59393ad44c9bf124c316f5f53b7aedd3a71d944900bfb035a742d290c"} Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523153 4895 scope.go:117] "RemoveContainer" containerID="bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523259 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.523272 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.575988 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.577567 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data" (OuterVolumeSpecName: "config-data") pod "326dc8dd-88ca-4dc0-8bd0-3f01cc636418" (UID: "326dc8dd-88ca-4dc0-8bd0-3f01cc636418"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.581057 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "326dc8dd-88ca-4dc0-8bd0-3f01cc636418" (UID: "326dc8dd-88ca-4dc0-8bd0-3f01cc636418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.602423 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.620551 4895 scope.go:117] "RemoveContainer" containerID="cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.621615 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.621649 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4mz2\" (UniqueName: \"kubernetes.io/projected/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-kube-api-access-v4mz2\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.621659 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.621668 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.629686 4895 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.672894 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.683765 4895 scope.go:117] "RemoveContainer" containerID="bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" Mar 20 13:44:28 crc kubenswrapper[4895]: E0320 13:44:28.684862 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38\": container with ID starting with bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38 not found: ID does not exist" containerID="bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.684896 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38"} err="failed to get container status \"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38\": rpc error: code = NotFound desc = could not find container \"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38\": container with ID starting with bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38 not found: ID does not exist" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.684920 4895 scope.go:117] "RemoveContainer" containerID="cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" Mar 20 13:44:28 crc kubenswrapper[4895]: E0320 13:44:28.685641 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb\": container with ID starting with 
cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb not found: ID does not exist" containerID="cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.685658 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb"} err="failed to get container status \"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb\": rpc error: code = NotFound desc = could not find container \"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb\": container with ID starting with cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb not found: ID does not exist" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.685670 4895 scope.go:117] "RemoveContainer" containerID="bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.689774 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38"} err="failed to get container status \"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38\": rpc error: code = NotFound desc = could not find container \"bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38\": container with ID starting with bbb96f855ca60bdcb1405c34482b78b8bfebf60eef34bde7f626f2ba01be6f38 not found: ID does not exist" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.689799 4895 scope.go:117] "RemoveContainer" containerID="cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.690049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod 
"326dc8dd-88ca-4dc0-8bd0-3f01cc636418" (UID: "326dc8dd-88ca-4dc0-8bd0-3f01cc636418"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.692786 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.693600 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb"} err="failed to get container status \"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb\": rpc error: code = NotFound desc = could not find container \"cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb\": container with ID starting with cc38f4cc683bcc65cb7ad5e7c12a6cd5d0c626d5bc4c6daf690841309efda7fb not found: ID does not exist" Mar 20 13:44:28 crc kubenswrapper[4895]: E0320 13:44:28.695317 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-metadata" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.695344 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-metadata" Mar 20 13:44:28 crc kubenswrapper[4895]: E0320 13:44:28.695359 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-log" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.695365 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-log" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.695621 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-log" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 
13:44:28.695637 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" containerName="nova-metadata-metadata" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.696534 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.699109 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.702993 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.715748 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.718356 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.724588 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.726795 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.730031 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/326dc8dd-88ca-4dc0-8bd0-3f01cc636418-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.734798 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.836737 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837072 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4hg\" (UniqueName: \"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837256 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txpl6\" (UniqueName: \"kubernetes.io/projected/c239ab6f-e370-422d-8af1-dff391b88461-kube-api-access-txpl6\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837417 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.837485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.923490 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.939897 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txpl6\" (UniqueName: \"kubernetes.io/projected/c239ab6f-e370-422d-8af1-dff391b88461-kube-api-access-txpl6\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.939996 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.940042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.940112 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.940141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.940193 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4hg\" (UniqueName: \"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.940263 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.945665 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.953101 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.953179 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.955118 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.957260 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/c239ab6f-e370-422d-8af1-dff391b88461-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.967246 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.994149 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:28 crc kubenswrapper[4895]: I0320 13:44:28.995973 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.002802 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.003050 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.017165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txpl6\" (UniqueName: \"kubernetes.io/projected/c239ab6f-e370-422d-8af1-dff391b88461-kube-api-access-txpl6\") pod \"kube-state-metrics-0\" (UID: \"c239ab6f-e370-422d-8af1-dff391b88461\") " pod="openstack/kube-state-metrics-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.026305 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4hg\" (UniqueName: \"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg\") pod \"nova-scheduler-0\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " pod="openstack/nova-scheduler-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.049237 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.128144 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.142126 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.156776 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.156823 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdpj\" (UniqueName: \"kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.156847 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.156882 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.156951 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc 
kubenswrapper[4895]: I0320 13:44:29.262818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdpj\" (UniqueName: \"kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.262878 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.262936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.263088 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.263260 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.264889 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.278964 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.294330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.296296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.301270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdpj\" (UniqueName: \"kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj\") pod \"nova-metadata-0\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.341837 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbda25c-946b-497c-afef-622259be6557" path="/var/lib/kubelet/pods/2bbda25c-946b-497c-afef-622259be6557/volumes" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.342980 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="326dc8dd-88ca-4dc0-8bd0-3f01cc636418" path="/var/lib/kubelet/pods/326dc8dd-88ca-4dc0-8bd0-3f01cc636418/volumes" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.343578 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f27bbad-8a84-4902-8349-8c8724552442" path="/var/lib/kubelet/pods/9f27bbad-8a84-4902-8349-8c8724552442/volumes" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.344151 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:44:29 crc kubenswrapper[4895]: W0320 13:44:29.344890 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0efc230_d08e_425c_87cb_96b47fa4474a.slice/crio-eefdf3977d6f81197a3cc478dcf7eba91ab43712088a4eaf55a71096d05f0acc WatchSource:0}: Error finding container eefdf3977d6f81197a3cc478dcf7eba91ab43712088a4eaf55a71096d05f0acc: Status 404 returned error can't find the container with id eefdf3977d6f81197a3cc478dcf7eba91ab43712088a4eaf55a71096d05f0acc Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.470873 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.570442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerStarted","Data":"eefdf3977d6f81197a3cc478dcf7eba91ab43712088a4eaf55a71096d05f0acc"} Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.669720 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.790471 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data\") pod \"23e0421a-d787-4327-b7bf-b4f974871690\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.790596 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4j2\" (UniqueName: \"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2\") pod \"23e0421a-d787-4327-b7bf-b4f974871690\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.790707 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle\") pod \"23e0421a-d787-4327-b7bf-b4f974871690\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.791378 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts\") pod \"23e0421a-d787-4327-b7bf-b4f974871690\" (UID: \"23e0421a-d787-4327-b7bf-b4f974871690\") " Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.797606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.797673 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.803538 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2" (OuterVolumeSpecName: "kube-api-access-pd4j2") pod "23e0421a-d787-4327-b7bf-b4f974871690" (UID: "23e0421a-d787-4327-b7bf-b4f974871690"). InnerVolumeSpecName "kube-api-access-pd4j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.807529 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts" (OuterVolumeSpecName: "scripts") pod "23e0421a-d787-4327-b7bf-b4f974871690" (UID: "23e0421a-d787-4327-b7bf-b4f974871690"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.821863 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.822364 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-central-agent" containerID="cri-o://682f274553a20375f129c760c6bdd471b988822bc84ea29b332969845985d7b0" gracePeriod=30 Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.825466 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-notification-agent" containerID="cri-o://2b74cbd2a13b39a194822da40aae23e8a89f8475d1cb598ba7c73a080ba9aae0" gracePeriod=30 Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.825582 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="proxy-httpd" containerID="cri-o://a83cc59874e33484146334881b4d4c84f236229de64fc23224b06cc64a03c8b9" gracePeriod=30 Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 
13:44:29.825617 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="sg-core" containerID="cri-o://5ee9774ceccc6ac6611b1e14382e6f044426d251a6babd1fc6ce66e4fd3eca91" gracePeriod=30 Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.891444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23e0421a-d787-4327-b7bf-b4f974871690" (UID: "23e0421a-d787-4327-b7bf-b4f974871690"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.891578 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data" (OuterVolumeSpecName: "config-data") pod "23e0421a-d787-4327-b7bf-b4f974871690" (UID: "23e0421a-d787-4327-b7bf-b4f974871690"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.893898 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.894050 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4j2\" (UniqueName: \"kubernetes.io/projected/23e0421a-d787-4327-b7bf-b4f974871690-kube-api-access-pd4j2\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.894173 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.894255 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0421a-d787-4327-b7bf-b4f974871690-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:29 crc kubenswrapper[4895]: I0320 13:44:29.895763 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.103363 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:44:30 crc kubenswrapper[4895]: W0320 13:44:30.104464 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc239ab6f_e370_422d_8af1_dff391b88461.slice/crio-12f87e032617dfe05a556bc8e7d8b22bcb7eeeee8d24fb7134d17e7faeb5b2df WatchSource:0}: Error finding container 12f87e032617dfe05a556bc8e7d8b22bcb7eeeee8d24fb7134d17e7faeb5b2df: Status 404 returned error can't find the container with id 12f87e032617dfe05a556bc8e7d8b22bcb7eeeee8d24fb7134d17e7faeb5b2df Mar 20 13:44:30 crc 
kubenswrapper[4895]: I0320 13:44:30.402714 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:44:30 crc kubenswrapper[4895]: W0320 13:44:30.407643 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0fde814_3ccb_4c95_915d_fa586ca8a578.slice/crio-e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee WatchSource:0}: Error finding container e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee: Status 404 returned error can't find the container with id e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.585213 4895 generic.go:334] "Generic (PLEG): container finished" podID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerID="009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c" exitCode=0 Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.585452 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerDied","Data":"009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.618716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3854bcd-8336-4aac-94e3-1b48dbef874e","Type":"ContainerStarted","Data":"a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.618760 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3854bcd-8336-4aac-94e3-1b48dbef874e","Type":"ContainerStarted","Data":"9201f3752ca4224daea43a5e454c5c9e4cc3e1d8b841989a6208b376bccf2f6c"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632292 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerID="a83cc59874e33484146334881b4d4c84f236229de64fc23224b06cc64a03c8b9" exitCode=0 Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632322 4895 generic.go:334] "Generic (PLEG): container finished" podID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerID="5ee9774ceccc6ac6611b1e14382e6f044426d251a6babd1fc6ce66e4fd3eca91" exitCode=2 Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632332 4895 generic.go:334] "Generic (PLEG): container finished" podID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerID="682f274553a20375f129c760c6bdd471b988822bc84ea29b332969845985d7b0" exitCode=0 Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632370 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerDied","Data":"a83cc59874e33484146334881b4d4c84f236229de64fc23224b06cc64a03c8b9"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632410 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerDied","Data":"5ee9774ceccc6ac6611b1e14382e6f044426d251a6babd1fc6ce66e4fd3eca91"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.632421 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerDied","Data":"682f274553a20375f129c760c6bdd471b988822bc84ea29b332969845985d7b0"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.637797 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c239ab6f-e370-422d-8af1-dff391b88461","Type":"ContainerStarted","Data":"12f87e032617dfe05a556bc8e7d8b22bcb7eeeee8d24fb7134d17e7faeb5b2df"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.667471 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerStarted","Data":"e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.667603 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.667591374 podStartE2EDuration="2.667591374s" podCreationTimestamp="2026-03-20 13:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:30.646808262 +0000 UTC m=+1370.156527228" watchObservedRunningTime="2026-03-20 13:44:30.667591374 +0000 UTC m=+1370.177310340" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.677784 4895 generic.go:334] "Generic (PLEG): container finished" podID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerID="f034a7c6eb7e642f60c29171ee52248b966d1d72e1280e9ab470217688384da6" exitCode=0 Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.677875 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerDied","Data":"f034a7c6eb7e642f60c29171ee52248b966d1d72e1280e9ab470217688384da6"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.680244 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" event={"ID":"23e0421a-d787-4327-b7bf-b4f974871690","Type":"ContainerDied","Data":"2eb0b9c504fb5df63dba7cd715bc4be968cb4151c188362852f6f72daed58d2b"} Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.680270 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb0b9c504fb5df63dba7cd715bc4be968cb4151c188362852f6f72daed58d2b" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.680318 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6sw4l" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.792294 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:44:30 crc kubenswrapper[4895]: E0320 13:44:30.792777 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e0421a-d787-4327-b7bf-b4f974871690" containerName="nova-cell1-conductor-db-sync" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.792798 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e0421a-d787-4327-b7bf-b4f974871690" containerName="nova-cell1-conductor-db-sync" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.793005 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e0421a-d787-4327-b7bf-b4f974871690" containerName="nova-cell1-conductor-db-sync" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.795235 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.800668 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.813109 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.927073 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.927157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frhf\" (UniqueName: 
\"kubernetes.io/projected/58722a98-11a8-4e98-8185-82f18acd6718-kube-api-access-5frhf\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.927202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:30 crc kubenswrapper[4895]: I0320 13:44:30.994407 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.029262 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.029351 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frhf\" (UniqueName: \"kubernetes.io/projected/58722a98-11a8-4e98-8185-82f18acd6718-kube-api-access-5frhf\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.029409 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.034150 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.037336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58722a98-11a8-4e98-8185-82f18acd6718-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.051932 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frhf\" (UniqueName: \"kubernetes.io/projected/58722a98-11a8-4e98-8185-82f18acd6718-kube-api-access-5frhf\") pod \"nova-cell1-conductor-0\" (UID: \"58722a98-11a8-4e98-8185-82f18acd6718\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.130431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle\") pod \"72be5ceb-9b35-4247-9ced-64d70bf674d3\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.130728 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs\") pod \"72be5ceb-9b35-4247-9ced-64d70bf674d3\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.130866 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55d78\" (UniqueName: 
\"kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78\") pod \"72be5ceb-9b35-4247-9ced-64d70bf674d3\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.130971 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data\") pod \"72be5ceb-9b35-4247-9ced-64d70bf674d3\" (UID: \"72be5ceb-9b35-4247-9ced-64d70bf674d3\") " Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.131677 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs" (OuterVolumeSpecName: "logs") pod "72be5ceb-9b35-4247-9ced-64d70bf674d3" (UID: "72be5ceb-9b35-4247-9ced-64d70bf674d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.137168 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78" (OuterVolumeSpecName: "kube-api-access-55d78") pod "72be5ceb-9b35-4247-9ced-64d70bf674d3" (UID: "72be5ceb-9b35-4247-9ced-64d70bf674d3"). InnerVolumeSpecName "kube-api-access-55d78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.159606 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72be5ceb-9b35-4247-9ced-64d70bf674d3" (UID: "72be5ceb-9b35-4247-9ced-64d70bf674d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.173105 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data" (OuterVolumeSpecName: "config-data") pod "72be5ceb-9b35-4247-9ced-64d70bf674d3" (UID: "72be5ceb-9b35-4247-9ced-64d70bf674d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.233599 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.233862 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72be5ceb-9b35-4247-9ced-64d70bf674d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.233938 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72be5ceb-9b35-4247-9ced-64d70bf674d3-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.234010 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55d78\" (UniqueName: \"kubernetes.io/projected/72be5ceb-9b35-4247-9ced-64d70bf674d3-kube-api-access-55d78\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.287030 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.716445 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c239ab6f-e370-422d-8af1-dff391b88461","Type":"ContainerStarted","Data":"110cfa67883bea120e8904cfc63cee220ddd12814000df7759d54149405f063a"} Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.718607 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.734872 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerStarted","Data":"af1352100d2092d7137fe012465a11f10fd84f26bba6920925bdadc92648b206"} Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.734919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerStarted","Data":"09879d3e625a0766e94659b2099c44ac187b1969ae5b9414269da3bb7c9b8950"} Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.762408 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.372077897 podStartE2EDuration="3.762373892s" podCreationTimestamp="2026-03-20 13:44:28 +0000 UTC" firstStartedPulling="2026-03-20 13:44:30.116647521 +0000 UTC m=+1369.626366487" lastFinishedPulling="2026-03-20 13:44:30.506943516 +0000 UTC m=+1370.016662482" observedRunningTime="2026-03-20 13:44:31.741003845 +0000 UTC m=+1371.250722811" watchObservedRunningTime="2026-03-20 13:44:31.762373892 +0000 UTC m=+1371.272092858" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.775440 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.775702 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72be5ceb-9b35-4247-9ced-64d70bf674d3","Type":"ContainerDied","Data":"a8d6b2310a267f59a7353242934889d108295e103dddd647345ed33aa734058c"} Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.775768 4895 scope.go:117] "RemoveContainer" containerID="f034a7c6eb7e642f60c29171ee52248b966d1d72e1280e9ab470217688384da6" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.781871 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.7816505769999997 podStartE2EDuration="3.781650577s" podCreationTimestamp="2026-03-20 13:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:31.772815919 +0000 UTC m=+1371.282534885" watchObservedRunningTime="2026-03-20 13:44:31.781650577 +0000 UTC m=+1371.291369543" Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.786172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerStarted","Data":"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086"} Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.805585 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:44:31 crc kubenswrapper[4895]: I0320 13:44:31.977430 4895 scope.go:117] "RemoveContainer" containerID="ce56361e12c569a23e84a6e04c0dd4dc58a0bc0515af9f7f451bc94e772607ba" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.003103 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.017942 4895 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.029552 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:32 crc kubenswrapper[4895]: E0320 13:44:32.029953 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-api" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.029970 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-api" Mar 20 13:44:32 crc kubenswrapper[4895]: E0320 13:44:32.029994 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-log" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.030000 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-log" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.030178 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-api" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.030202 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" containerName="nova-api-log" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.031270 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.036205 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.046294 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.161349 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.161430 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.161638 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.161681 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfkkz\" (UniqueName: \"kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.263370 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.263463 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfkkz\" (UniqueName: \"kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.263549 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.263586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.264048 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.277076 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.280765 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.286956 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfkkz\" (UniqueName: \"kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz\") pod \"nova-api-0\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.348219 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.829813 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"58722a98-11a8-4e98-8185-82f18acd6718","Type":"ContainerStarted","Data":"a9b623f8b85e2db979e02debbc2e17458c8c21ac452f23d70725e0259feb9f24"} Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.830075 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"58722a98-11a8-4e98-8185-82f18acd6718","Type":"ContainerStarted","Data":"f6927009a379bccee07519f3d1b04df8bfb34334a402d700812225b354d7c861"} Mar 20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.850712 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8506941919999997 podStartE2EDuration="2.850694192s" podCreationTimestamp="2026-03-20 13:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:32.846688313 +0000 UTC m=+1372.356407279" watchObservedRunningTime="2026-03-20 13:44:32.850694192 +0000 UTC m=+1372.360413148" Mar 
20 13:44:32 crc kubenswrapper[4895]: I0320 13:44:32.923431 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.230655 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72be5ceb-9b35-4247-9ced-64d70bf674d3" path="/var/lib/kubelet/pods/72be5ceb-9b35-4247-9ced-64d70bf674d3/volumes" Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.842074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerStarted","Data":"3b37f8a33382926f888cb81376135ce08ee0ea083ac13984244f84562db2adf1"} Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.842429 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerStarted","Data":"e4f1ff0406fcff5005682e9cb3e1e1d4f08fc2fe98cf9ffa41b1dfd7321f129e"} Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.842442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerStarted","Data":"d1622500898ecb3d45a71aa3f6b058fd5bf13c34d4e8514767e12f9a90393ef3"} Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.844766 4895 generic.go:334] "Generic (PLEG): container finished" podID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerID="2b74cbd2a13b39a194822da40aae23e8a89f8475d1cb598ba7c73a080ba9aae0" exitCode=0 Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.845159 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerDied","Data":"2b74cbd2a13b39a194822da40aae23e8a89f8475d1cb598ba7c73a080ba9aae0"} Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.845206 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:33 crc kubenswrapper[4895]: I0320 13:44:33.875973 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.875947267 podStartE2EDuration="2.875947267s" podCreationTimestamp="2026-03-20 13:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:33.860368054 +0000 UTC m=+1373.370087020" watchObservedRunningTime="2026-03-20 13:44:33.875947267 +0000 UTC m=+1373.385666243" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.128845 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.804560 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.856854 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d370eb51-2043-498f-b94a-11ac6f56f65f","Type":"ContainerDied","Data":"9597a2c35a2d6fbe65c036a877e2bad0ddacbaa5b56726f143b2b86539b01b2b"} Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.856863 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.856925 4895 scope.go:117] "RemoveContainer" containerID="a83cc59874e33484146334881b4d4c84f236229de64fc23224b06cc64a03c8b9" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.876642 4895 scope.go:117] "RemoveContainer" containerID="5ee9774ceccc6ac6611b1e14382e6f044426d251a6babd1fc6ce66e4fd3eca91" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.900541 4895 scope.go:117] "RemoveContainer" containerID="2b74cbd2a13b39a194822da40aae23e8a89f8475d1cb598ba7c73a080ba9aae0" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.920678 4895 scope.go:117] "RemoveContainer" containerID="682f274553a20375f129c760c6bdd471b988822bc84ea29b332969845985d7b0" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.925078 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.925206 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.927536 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.933553 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5dxsx\" (UniqueName: \"kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.934191 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.934225 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.934267 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd\") pod \"d370eb51-2043-498f-b94a-11ac6f56f65f\" (UID: \"d370eb51-2043-498f-b94a-11ac6f56f65f\") " Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.937144 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.937360 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.937950 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx" (OuterVolumeSpecName: "kube-api-access-5dxsx") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "kube-api-access-5dxsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.938362 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts" (OuterVolumeSpecName: "scripts") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:34 crc kubenswrapper[4895]: I0320 13:44:34.973512 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.034161 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036521 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036547 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dxsx\" (UniqueName: \"kubernetes.io/projected/d370eb51-2043-498f-b94a-11ac6f56f65f-kube-api-access-5dxsx\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036558 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036567 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036575 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d370eb51-2043-498f-b94a-11ac6f56f65f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.036582 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.089922 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data" (OuterVolumeSpecName: "config-data") pod "d370eb51-2043-498f-b94a-11ac6f56f65f" (UID: "d370eb51-2043-498f-b94a-11ac6f56f65f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.138227 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d370eb51-2043-498f-b94a-11ac6f56f65f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.204997 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.235705 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243030 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:35 crc kubenswrapper[4895]: E0320 13:44:35.243450 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="proxy-httpd" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243464 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="proxy-httpd" Mar 20 13:44:35 crc kubenswrapper[4895]: E0320 13:44:35.243484 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-notification-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243492 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-notification-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: E0320 13:44:35.243509 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="sg-core" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243515 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="sg-core" Mar 20 13:44:35 crc kubenswrapper[4895]: E0320 13:44:35.243532 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-central-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243546 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-central-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243717 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-central-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243730 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="sg-core" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243736 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="ceilometer-notification-agent" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.243747 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" containerName="proxy-httpd" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.245610 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.248895 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.249078 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.249235 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.257917 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.341929 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342050 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342128 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qqnbw\" (UniqueName: \"kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342219 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342313 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.342364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444189 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444294 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444319 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnbw\" (UniqueName: \"kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444451 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.444469 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.445187 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.445352 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.448348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.448807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 
13:44:35.449786 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.450425 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.451070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.463094 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnbw\" (UniqueName: \"kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw\") pod \"ceilometer-0\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " pod="openstack/ceilometer-0" Mar 20 13:44:35 crc kubenswrapper[4895]: I0320 13:44:35.572455 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:36 crc kubenswrapper[4895]: I0320 13:44:36.820665 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:36 crc kubenswrapper[4895]: I0320 13:44:36.890449 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerStarted","Data":"535ef50d5d0060f051726986c643cd4f761bfcaf1f9944a778f4a0e904f238da"} Mar 20 13:44:37 crc kubenswrapper[4895]: I0320 13:44:37.243698 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d370eb51-2043-498f-b94a-11ac6f56f65f" path="/var/lib/kubelet/pods/d370eb51-2043-498f-b94a-11ac6f56f65f/volumes" Mar 20 13:44:37 crc kubenswrapper[4895]: I0320 13:44:37.901258 4895 generic.go:334] "Generic (PLEG): container finished" podID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerID="21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086" exitCode=0 Mar 20 13:44:37 crc kubenswrapper[4895]: I0320 13:44:37.901318 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerDied","Data":"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086"} Mar 20 13:44:37 crc kubenswrapper[4895]: I0320 13:44:37.904026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerStarted","Data":"e8d7353ee69a4f6b9a87a28dba0df9a8cd146086135a781b873f5b45a8ba1c2f"} Mar 20 13:44:38 crc kubenswrapper[4895]: I0320 13:44:38.914280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerStarted","Data":"4565f0a6b6dc98b496e417670f12b596ead4ba71d9bddb2775e1584f2844947b"} Mar 20 13:44:38 crc kubenswrapper[4895]: I0320 13:44:38.918214 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerStarted","Data":"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d"} Mar 20 13:44:38 crc kubenswrapper[4895]: I0320 13:44:38.937606 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-226rz" podStartSLOduration=4.157665615 podStartE2EDuration="11.937592215s" podCreationTimestamp="2026-03-20 13:44:27 +0000 UTC" firstStartedPulling="2026-03-20 13:44:30.618621137 +0000 UTC m=+1370.128340103" lastFinishedPulling="2026-03-20 13:44:38.398547737 +0000 UTC m=+1377.908266703" observedRunningTime="2026-03-20 13:44:38.933697849 +0000 UTC m=+1378.443416815" watchObservedRunningTime="2026-03-20 13:44:38.937592215 +0000 UTC m=+1378.447311181" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.129112 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.247666 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.269080 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.472368 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.474016 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.929948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerStarted","Data":"1894ad8a7b553eec8e7c7aa6e1afbdbf5c7eaa68c7352340b3331cd1facad6fd"} Mar 20 13:44:39 crc kubenswrapper[4895]: I0320 13:44:39.963413 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:44:40 crc kubenswrapper[4895]: I0320 13:44:40.482534 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:40 crc kubenswrapper[4895]: I0320 13:44:40.482538 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:41 crc kubenswrapper[4895]: I0320 13:44:41.318341 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:44:41 crc kubenswrapper[4895]: I0320 13:44:41.954219 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerStarted","Data":"8a5915d43781572354cea54e6fc16bbe32d5bcca45952e16c258f6d4a302828f"} Mar 20 13:44:41 crc kubenswrapper[4895]: I0320 13:44:41.954358 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:44:41 crc kubenswrapper[4895]: I0320 13:44:41.977360 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.964720609 podStartE2EDuration="6.977341696s" podCreationTimestamp="2026-03-20 13:44:35 +0000 UTC" 
firstStartedPulling="2026-03-20 13:44:36.839922302 +0000 UTC m=+1376.349641268" lastFinishedPulling="2026-03-20 13:44:40.852543389 +0000 UTC m=+1380.362262355" observedRunningTime="2026-03-20 13:44:41.97179076 +0000 UTC m=+1381.481509726" watchObservedRunningTime="2026-03-20 13:44:41.977341696 +0000 UTC m=+1381.487060662" Mar 20 13:44:42 crc kubenswrapper[4895]: I0320 13:44:42.348454 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:44:42 crc kubenswrapper[4895]: I0320 13:44:42.349470 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:44:43 crc kubenswrapper[4895]: I0320 13:44:43.431578 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:43 crc kubenswrapper[4895]: I0320 13:44:43.431836 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.231:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:44:47 crc kubenswrapper[4895]: I0320 13:44:47.472232 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:44:47 crc kubenswrapper[4895]: I0320 13:44:47.472843 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:44:48 crc kubenswrapper[4895]: I0320 13:44:48.153457 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:48 crc kubenswrapper[4895]: I0320 13:44:48.153872 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.025412 4895 generic.go:334] "Generic (PLEG): container finished" podID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" containerID="8aa9601c57067c257f81e2d5b1e61583cfebd23ed5f2326fdb664055f7308c10" exitCode=137 Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.025437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"438b09cd-fd26-4ad4-a095-a63130a8e2f7","Type":"ContainerDied","Data":"8aa9601c57067c257f81e2d5b1e61583cfebd23ed5f2326fdb664055f7308c10"} Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.025753 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"438b09cd-fd26-4ad4-a095-a63130a8e2f7","Type":"ContainerDied","Data":"a86cef7a7d9d7b0fd80c5f78cf31054a576c565bbdd54072cf6545629347b760"} Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.025767 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a86cef7a7d9d7b0fd80c5f78cf31054a576c565bbdd54072cf6545629347b760" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.071914 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.165426 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data\") pod \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.165526 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdlx8\" (UniqueName: \"kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8\") pod \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.165711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle\") pod \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\" (UID: \"438b09cd-fd26-4ad4-a095-a63130a8e2f7\") " Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.191406 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8" (OuterVolumeSpecName: "kube-api-access-gdlx8") pod "438b09cd-fd26-4ad4-a095-a63130a8e2f7" (UID: "438b09cd-fd26-4ad4-a095-a63130a8e2f7"). InnerVolumeSpecName "kube-api-access-gdlx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.205949 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-226rz" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" probeResult="failure" output=< Mar 20 13:44:49 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:44:49 crc kubenswrapper[4895]: > Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.206356 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data" (OuterVolumeSpecName: "config-data") pod "438b09cd-fd26-4ad4-a095-a63130a8e2f7" (UID: "438b09cd-fd26-4ad4-a095-a63130a8e2f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.260610 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438b09cd-fd26-4ad4-a095-a63130a8e2f7" (UID: "438b09cd-fd26-4ad4-a095-a63130a8e2f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.267882 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.267915 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdlx8\" (UniqueName: \"kubernetes.io/projected/438b09cd-fd26-4ad4-a095-a63130a8e2f7-kube-api-access-gdlx8\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.267925 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438b09cd-fd26-4ad4-a095-a63130a8e2f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.479581 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.485132 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:44:49 crc kubenswrapper[4895]: I0320 13:44:49.486793 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.036717 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.042977 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.090774 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.103563 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.219591 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:50 crc kubenswrapper[4895]: E0320 13:44:50.219998 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.220015 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.220215 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.220943 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.223171 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.229671 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.232304 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.250465 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.293215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.293282 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.293325 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc 
kubenswrapper[4895]: I0320 13:44:50.293571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.293630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhj9\" (UniqueName: \"kubernetes.io/projected/259d9241-bf26-46fe-85ea-8ce9efdf0821-kube-api-access-xkhj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.348845 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.348999 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.395713 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.395791 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.395828 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.395941 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.395964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhj9\" (UniqueName: \"kubernetes.io/projected/259d9241-bf26-46fe-85ea-8ce9efdf0821-kube-api-access-xkhj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.401998 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.402200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.404014 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.405616 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/259d9241-bf26-46fe-85ea-8ce9efdf0821-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.425320 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhj9\" (UniqueName: \"kubernetes.io/projected/259d9241-bf26-46fe-85ea-8ce9efdf0821-kube-api-access-xkhj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"259d9241-bf26-46fe-85ea-8ce9efdf0821\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:50 crc kubenswrapper[4895]: I0320 13:44:50.550071 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:51 crc kubenswrapper[4895]: W0320 13:44:51.071831 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259d9241_bf26_46fe_85ea_8ce9efdf0821.slice/crio-1e88c827a2cfa685a94e5852da2213c276d86603c3e8a795b3ed01ebfc4bd451 WatchSource:0}: Error finding container 1e88c827a2cfa685a94e5852da2213c276d86603c3e8a795b3ed01ebfc4bd451: Status 404 returned error can't find the container with id 1e88c827a2cfa685a94e5852da2213c276d86603c3e8a795b3ed01ebfc4bd451 Mar 20 13:44:51 crc kubenswrapper[4895]: I0320 13:44:51.079246 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:44:51 crc kubenswrapper[4895]: I0320 13:44:51.224354 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438b09cd-fd26-4ad4-a095-a63130a8e2f7" path="/var/lib/kubelet/pods/438b09cd-fd26-4ad4-a095-a63130a8e2f7/volumes" Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.060029 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"259d9241-bf26-46fe-85ea-8ce9efdf0821","Type":"ContainerStarted","Data":"710fd6f7c045db99ab8ec2ad513c594bb2cea8a45f023c137f1f858ba377ef57"} Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.060724 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"259d9241-bf26-46fe-85ea-8ce9efdf0821","Type":"ContainerStarted","Data":"1e88c827a2cfa685a94e5852da2213c276d86603c3e8a795b3ed01ebfc4bd451"} Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.082807 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.082789093 podStartE2EDuration="2.082789093s" podCreationTimestamp="2026-03-20 13:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:52.080093977 +0000 UTC m=+1391.589812943" watchObservedRunningTime="2026-03-20 13:44:52.082789093 +0000 UTC m=+1391.592508059" Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.296754 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.296802 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.354038 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.354648 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:44:52 crc kubenswrapper[4895]: I0320 13:44:52.364140 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.087206 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.262669 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.272252 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.293210 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369560 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369640 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369685 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq2b4\" (UniqueName: \"kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369735 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.369804 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472322 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq2b4\" (UniqueName: \"kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472381 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472423 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472480 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.472627 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.473333 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.473577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.473687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: 
\"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.474271 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.474434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.497390 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq2b4\" (UniqueName: \"kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4\") pod \"dnsmasq-dns-54dd998c-qjgm2\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:53 crc kubenswrapper[4895]: I0320 13:44:53.602820 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:54 crc kubenswrapper[4895]: I0320 13:44:54.119303 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:44:54 crc kubenswrapper[4895]: E0320 13:44:54.744451 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod338f95a2_0180_49f2_80b4_46673037665a.slice/crio-conmon-d8291838b6414fe683c44ffbbcea5f6d6dd47c85d02579b39298c4d98734e5cb.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:44:55 crc kubenswrapper[4895]: I0320 13:44:55.095422 4895 generic.go:334] "Generic (PLEG): container finished" podID="338f95a2-0180-49f2-80b4-46673037665a" containerID="d8291838b6414fe683c44ffbbcea5f6d6dd47c85d02579b39298c4d98734e5cb" exitCode=0 Mar 20 13:44:55 crc kubenswrapper[4895]: I0320 13:44:55.095508 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" event={"ID":"338f95a2-0180-49f2-80b4-46673037665a","Type":"ContainerDied","Data":"d8291838b6414fe683c44ffbbcea5f6d6dd47c85d02579b39298c4d98734e5cb"} Mar 20 13:44:55 crc kubenswrapper[4895]: I0320 13:44:55.095560 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" event={"ID":"338f95a2-0180-49f2-80b4-46673037665a","Type":"ContainerStarted","Data":"912afc9994e1f3304ae43630cb1f0e55e5ba6581676cec12473b7b7b93a70f00"} Mar 20 13:44:55 crc kubenswrapper[4895]: I0320 13:44:55.550940 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:44:55 crc kubenswrapper[4895]: I0320 13:44:55.897061 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.106758 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-54dd998c-qjgm2" event={"ID":"338f95a2-0180-49f2-80b4-46673037665a","Type":"ContainerStarted","Data":"f413ac5d52a19b7db9b3724bb23581cfeee294e48610a7c08b8d88507fda8625"} Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.106929 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-log" containerID="cri-o://e4f1ff0406fcff5005682e9cb3e1e1d4f08fc2fe98cf9ffa41b1dfd7321f129e" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.107022 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-api" containerID="cri-o://3b37f8a33382926f888cb81376135ce08ee0ea083ac13984244f84562db2adf1" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.133772 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" podStartSLOduration=3.133753774 podStartE2EDuration="3.133753774s" podCreationTimestamp="2026-03-20 13:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:56.127040499 +0000 UTC m=+1395.636759465" watchObservedRunningTime="2026-03-20 13:44:56.133753774 +0000 UTC m=+1395.643472740" Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.558825 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.559188 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-central-agent" containerID="cri-o://e8d7353ee69a4f6b9a87a28dba0df9a8cd146086135a781b873f5b45a8ba1c2f" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 
13:44:56.559289 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="sg-core" containerID="cri-o://1894ad8a7b553eec8e7c7aa6e1afbdbf5c7eaa68c7352340b3331cd1facad6fd" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.559314 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-notification-agent" containerID="cri-o://4565f0a6b6dc98b496e417670f12b596ead4ba71d9bddb2775e1584f2844947b" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.559498 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="proxy-httpd" containerID="cri-o://8a5915d43781572354cea54e6fc16bbe32d5bcca45952e16c258f6d4a302828f" gracePeriod=30 Mar 20 13:44:56 crc kubenswrapper[4895]: I0320 13:44:56.596178 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.118532 4895 generic.go:334] "Generic (PLEG): container finished" podID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerID="e4f1ff0406fcff5005682e9cb3e1e1d4f08fc2fe98cf9ffa41b1dfd7321f129e" exitCode=143 Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.118653 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerDied","Data":"e4f1ff0406fcff5005682e9cb3e1e1d4f08fc2fe98cf9ffa41b1dfd7321f129e"} Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122544 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="79767c32-93f3-4097-a2de-eba79345a8e3" containerID="8a5915d43781572354cea54e6fc16bbe32d5bcca45952e16c258f6d4a302828f" exitCode=0 Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122659 4895 generic.go:334] "Generic (PLEG): container finished" podID="79767c32-93f3-4097-a2de-eba79345a8e3" containerID="1894ad8a7b553eec8e7c7aa6e1afbdbf5c7eaa68c7352340b3331cd1facad6fd" exitCode=2 Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122712 4895 generic.go:334] "Generic (PLEG): container finished" podID="79767c32-93f3-4097-a2de-eba79345a8e3" containerID="e8d7353ee69a4f6b9a87a28dba0df9a8cd146086135a781b873f5b45a8ba1c2f" exitCode=0 Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerDied","Data":"8a5915d43781572354cea54e6fc16bbe32d5bcca45952e16c258f6d4a302828f"} Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122851 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerDied","Data":"1894ad8a7b553eec8e7c7aa6e1afbdbf5c7eaa68c7352340b3331cd1facad6fd"} Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.122888 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerDied","Data":"e8d7353ee69a4f6b9a87a28dba0df9a8cd146086135a781b873f5b45a8ba1c2f"} Mar 20 13:44:57 crc kubenswrapper[4895]: I0320 13:44:57.123189 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.135507 4895 generic.go:334] "Generic (PLEG): container finished" podID="79767c32-93f3-4097-a2de-eba79345a8e3" containerID="4565f0a6b6dc98b496e417670f12b596ead4ba71d9bddb2775e1584f2844947b" exitCode=0 Mar 20 13:44:58 crc 
kubenswrapper[4895]: I0320 13:44:58.136516 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerDied","Data":"4565f0a6b6dc98b496e417670f12b596ead4ba71d9bddb2775e1584f2844947b"} Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.244663 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303100 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqnbw\" (UniqueName: \"kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303203 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303252 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303274 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303297 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303319 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303409 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.303446 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle\") pod \"79767c32-93f3-4097-a2de-eba79345a8e3\" (UID: \"79767c32-93f3-4097-a2de-eba79345a8e3\") " Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.304448 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.305123 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.312601 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts" (OuterVolumeSpecName: "scripts") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.313659 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw" (OuterVolumeSpecName: "kube-api-access-qqnbw") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "kube-api-access-qqnbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.355292 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.384749 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.405964 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqnbw\" (UniqueName: \"kubernetes.io/projected/79767c32-93f3-4097-a2de-eba79345a8e3-kube-api-access-qqnbw\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.405997 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.406007 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.406018 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.406026 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.406033 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/79767c32-93f3-4097-a2de-eba79345a8e3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.416617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.452711 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data" (OuterVolumeSpecName: "config-data") pod "79767c32-93f3-4097-a2de-eba79345a8e3" (UID: "79767c32-93f3-4097-a2de-eba79345a8e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.508155 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4895]: I0320 13:44:58.508194 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79767c32-93f3-4097-a2de-eba79345a8e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.149122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79767c32-93f3-4097-a2de-eba79345a8e3","Type":"ContainerDied","Data":"535ef50d5d0060f051726986c643cd4f761bfcaf1f9944a778f4a0e904f238da"} Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.149680 4895 scope.go:117] "RemoveContainer" containerID="8a5915d43781572354cea54e6fc16bbe32d5bcca45952e16c258f6d4a302828f" Mar 20 
13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.149209 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.198509 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.201595 4895 scope.go:117] "RemoveContainer" containerID="1894ad8a7b553eec8e7c7aa6e1afbdbf5c7eaa68c7352340b3331cd1facad6fd" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.236636 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.241263 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-226rz" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" probeResult="failure" output=< Mar 20 13:44:59 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:44:59 crc kubenswrapper[4895]: > Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.243269 4895 scope.go:117] "RemoveContainer" containerID="4565f0a6b6dc98b496e417670f12b596ead4ba71d9bddb2775e1584f2844947b" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251198 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:59 crc kubenswrapper[4895]: E0320 13:44:59.251615 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-notification-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251632 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-notification-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: E0320 13:44:59.251663 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" 
containerName="sg-core" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251670 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="sg-core" Mar 20 13:44:59 crc kubenswrapper[4895]: E0320 13:44:59.251682 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-central-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251688 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-central-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: E0320 13:44:59.251699 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="proxy-httpd" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251706 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="proxy-httpd" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251895 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="sg-core" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251911 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-notification-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251928 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="proxy-httpd" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.251945 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" containerName="ceilometer-central-agent" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.253833 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.263108 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.263166 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.263596 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.281089 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.307122 4895 scope.go:117] "RemoveContainer" containerID="e8d7353ee69a4f6b9a87a28dba0df9a8cd146086135a781b873f5b45a8ba1c2f" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322263 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 
13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5gv\" (UniqueName: \"kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322800 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.322828 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424712 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ll5gv\" (UniqueName: \"kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424842 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424929 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424958 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.424985 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.425773 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.425778 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.430081 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.430218 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.430922 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.431162 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.434677 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.447663 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5gv\" (UniqueName: \"kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv\") pod \"ceilometer-0\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4895]: I0320 13:44:59.590260 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.143254 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9"] Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.145073 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.151753 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.151845 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.173698 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9"] Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.176583 4895 generic.go:334] "Generic (PLEG): container finished" podID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerID="3b37f8a33382926f888cb81376135ce08ee0ea083ac13984244f84562db2adf1" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.176626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerDied","Data":"3b37f8a33382926f888cb81376135ce08ee0ea083ac13984244f84562db2adf1"} Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.176652 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9","Type":"ContainerDied","Data":"d1622500898ecb3d45a71aa3f6b058fd5bf13c34d4e8514767e12f9a90393ef3"} Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.176663 4895 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="d1622500898ecb3d45a71aa3f6b058fd5bf13c34d4e8514767e12f9a90393ef3" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.237122 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.248101 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w4n7\" (UniqueName: \"kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.248584 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.248643 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.331392 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.350602 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w4n7\" (UniqueName: \"kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.350706 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.350758 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.351683 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.360113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume\") pod 
\"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.387733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w4n7\" (UniqueName: \"kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7\") pod \"collect-profiles-29566905-d7bv9\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.452050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle\") pod \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.452179 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfkkz\" (UniqueName: \"kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz\") pod \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.452391 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs\") pod \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.452435 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data\") pod \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\" (UID: \"eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9\") " Mar 
20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.455817 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs" (OuterVolumeSpecName: "logs") pod "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" (UID: "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.473503 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz" (OuterVolumeSpecName: "kube-api-access-gfkkz") pod "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" (UID: "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9"). InnerVolumeSpecName "kube-api-access-gfkkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.502532 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" (UID: "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.549496 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data" (OuterVolumeSpecName: "config-data") pod "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" (UID: "eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.551349 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.555234 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.555264 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfkkz\" (UniqueName: \"kubernetes.io/projected/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-kube-api-access-gfkkz\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.555275 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.555288 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.578647 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:45:00 crc kubenswrapper[4895]: I0320 13:45:00.628867 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.142843 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.208168 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" event={"ID":"6273873f-7a10-4969-a18b-c041d9500d8b","Type":"ContainerStarted","Data":"0729805b0e91aae9c63d9df8cccd06096acc176a50af0933827772bfa0db8561"} Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.211683 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerStarted","Data":"f46d03abb83b1eef1e620dbf8ff72082789c2d6d1b23a10ec39186adabeb853f"} Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.211734 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerStarted","Data":"3cac05fdf70fb2190639ba84f4d1587bc3945e92c6f2a427302726972dc9f030"} Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.211850 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.245977 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79767c32-93f3-4097-a2de-eba79345a8e3" path="/var/lib/kubelet/pods/79767c32-93f3-4097-a2de-eba79345a8e3/volumes" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.247235 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.384373 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.426452 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.471655 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:01 crc kubenswrapper[4895]: E0320 13:45:01.472056 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-api" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.472068 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-api" Mar 20 13:45:01 crc kubenswrapper[4895]: E0320 13:45:01.472104 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-log" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.472110 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-log" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.472283 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-log" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.472299 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" containerName="nova-api-api" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.473355 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.475778 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.490688 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.490861 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.499164 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596544 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " 
pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596806 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffmh\" (UniqueName: \"kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.596828 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.617514 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vxtht"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.634497 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vxtht"] Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.634631 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.642990 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.643196 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699639 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699752 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffmh\" (UniqueName: \"kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699780 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699825 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699840 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699882 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699909 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.699960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.700053 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9fx\" (UniqueName: \"kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.700089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.702029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.729817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.730994 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.738833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.743024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffmh\" (UniqueName: \"kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.743867 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.802042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9fx\" (UniqueName: \"kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.802592 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.803187 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.803802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.807106 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.809274 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.810703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.819222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9fx\" (UniqueName: \"kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx\") pod \"nova-cell1-cell-mapping-vxtht\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.840893 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:01 crc kubenswrapper[4895]: I0320 13:45:01.986329 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:02 crc kubenswrapper[4895]: I0320 13:45:02.239860 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerStarted","Data":"9463ca9a976be5cf6d28f485c0eb78ff286439c3b1f2bd8cdc8e765224202d78"} Mar 20 13:45:02 crc kubenswrapper[4895]: I0320 13:45:02.254992 4895 generic.go:334] "Generic (PLEG): container finished" podID="6273873f-7a10-4969-a18b-c041d9500d8b" containerID="586bf06bb776838e6fc1201a63fb4f25cdb2767a45120f8b490c3af9bc1293cf" exitCode=0 Mar 20 13:45:02 crc kubenswrapper[4895]: I0320 13:45:02.255523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" event={"ID":"6273873f-7a10-4969-a18b-c041d9500d8b","Type":"ContainerDied","Data":"586bf06bb776838e6fc1201a63fb4f25cdb2767a45120f8b490c3af9bc1293cf"} Mar 20 13:45:02 crc kubenswrapper[4895]: I0320 13:45:02.350590 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:02 crc kubenswrapper[4895]: I0320 13:45:02.810699 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vxtht"] Mar 20 13:45:02 crc kubenswrapper[4895]: W0320 13:45:02.813015 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod293e033c_47da_4d3e_af29_088700965fc1.slice/crio-269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c WatchSource:0}: Error finding container 269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c: Status 404 returned error can't find the container with id 269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.271443 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9" 
path="/var/lib/kubelet/pods/eee03c8e-9f62-4a8b-8734-5c4f55b3fdb9/volumes" Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.275256 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerStarted","Data":"20cdb5c5d86687cb4d34ea95d0eaa6a21477df07f9810fa048df7e13652f4f04"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.278330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerStarted","Data":"3d84c425c255cea632964f64cd233bdd82ee514ab4a6b96de1f2bdeec35db2d0"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.278377 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerStarted","Data":"156925f6332f21b83e5cfc81c240c0408cb90e749cd35ee11dc37cdf0db758d0"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.278400 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerStarted","Data":"12b928a3c89c2f2432f62a4c122bd0ea280855070ee140bdb019cadc06f9a5ae"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.285803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vxtht" event={"ID":"293e033c-47da-4d3e-af29-088700965fc1","Type":"ContainerStarted","Data":"bbdd1d1f96c305553c0cdb619b80965b4410e25181f8fb5a9566ba1154feba29"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.286011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vxtht" event={"ID":"293e033c-47da-4d3e-af29-088700965fc1","Type":"ContainerStarted","Data":"269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c"} Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.325288 4895 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.325268679 podStartE2EDuration="2.325268679s" podCreationTimestamp="2026-03-20 13:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:03.302194341 +0000 UTC m=+1402.811913307" watchObservedRunningTime="2026-03-20 13:45:03.325268679 +0000 UTC m=+1402.834987645" Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.342359 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vxtht" podStartSLOduration=2.342339859 podStartE2EDuration="2.342339859s" podCreationTimestamp="2026-03-20 13:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:03.326774406 +0000 UTC m=+1402.836493372" watchObservedRunningTime="2026-03-20 13:45:03.342339859 +0000 UTC m=+1402.852058825" Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.604649 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.739502 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:45:03 crc kubenswrapper[4895]: I0320 13:45:03.739768 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="dnsmasq-dns" containerID="cri-o://7140b863324081883ed3fb6bbcbc4576d67787181aeca296da749830fde15fc9" gracePeriod=10 Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.148916 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.184038 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume\") pod \"6273873f-7a10-4969-a18b-c041d9500d8b\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.184152 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w4n7\" (UniqueName: \"kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7\") pod \"6273873f-7a10-4969-a18b-c041d9500d8b\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.184337 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume\") pod \"6273873f-7a10-4969-a18b-c041d9500d8b\" (UID: \"6273873f-7a10-4969-a18b-c041d9500d8b\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.184741 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume" (OuterVolumeSpecName: "config-volume") pod "6273873f-7a10-4969-a18b-c041d9500d8b" (UID: "6273873f-7a10-4969-a18b-c041d9500d8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.202573 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6273873f-7a10-4969-a18b-c041d9500d8b" (UID: "6273873f-7a10-4969-a18b-c041d9500d8b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.203894 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7" (OuterVolumeSpecName: "kube-api-access-2w4n7") pod "6273873f-7a10-4969-a18b-c041d9500d8b" (UID: "6273873f-7a10-4969-a18b-c041d9500d8b"). InnerVolumeSpecName "kube-api-access-2w4n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.286511 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6273873f-7a10-4969-a18b-c041d9500d8b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.286543 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6273873f-7a10-4969-a18b-c041d9500d8b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.286553 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w4n7\" (UniqueName: \"kubernetes.io/projected/6273873f-7a10-4969-a18b-c041d9500d8b-kube-api-access-2w4n7\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.317777 4895 generic.go:334] "Generic (PLEG): container finished" podID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerID="7140b863324081883ed3fb6bbcbc4576d67787181aeca296da749830fde15fc9" exitCode=0 Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.317844 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" event={"ID":"fbf08ec8-de92-4326-9567-c6fe64dfa07e","Type":"ContainerDied","Data":"7140b863324081883ed3fb6bbcbc4576d67787181aeca296da749830fde15fc9"} Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.330573 4895 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.331484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9" event={"ID":"6273873f-7a10-4969-a18b-c041d9500d8b","Type":"ContainerDied","Data":"0729805b0e91aae9c63d9df8cccd06096acc176a50af0933827772bfa0db8561"} Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.331557 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0729805b0e91aae9c63d9df8cccd06096acc176a50af0933827772bfa0db8561" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.834308 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897430 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897522 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897554 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qsgd\" (UniqueName: \"kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897720 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.897819 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb\") pod \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\" (UID: \"fbf08ec8-de92-4326-9567-c6fe64dfa07e\") " Mar 20 13:45:04 crc kubenswrapper[4895]: I0320 13:45:04.919615 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd" (OuterVolumeSpecName: "kube-api-access-6qsgd") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). InnerVolumeSpecName "kube-api-access-6qsgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.008353 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qsgd\" (UniqueName: \"kubernetes.io/projected/fbf08ec8-de92-4326-9567-c6fe64dfa07e-kube-api-access-6qsgd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.022139 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.035188 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.038212 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.043283 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config" (OuterVolumeSpecName: "config") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.059589 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbf08ec8-de92-4326-9567-c6fe64dfa07e" (UID: "fbf08ec8-de92-4326-9567-c6fe64dfa07e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.110232 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.110256 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.110266 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.110277 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.110285 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf08ec8-de92-4326-9567-c6fe64dfa07e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.341846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" event={"ID":"fbf08ec8-de92-4326-9567-c6fe64dfa07e","Type":"ContainerDied","Data":"9156313be49edf7d56594beb875f478b5514d2242481276d5354df571eb3f03f"} Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.341890 4895 scope.go:117] "RemoveContainer" containerID="7140b863324081883ed3fb6bbcbc4576d67787181aeca296da749830fde15fc9" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.341912 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-884c8b8f5-jkq4x" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.367148 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.372792 4895 scope.go:117] "RemoveContainer" containerID="b310e722bb88098ec3f93098c414ebe861a79c4841e205a7f00791b9b36ed8fb" Mar 20 13:45:05 crc kubenswrapper[4895]: I0320 13:45:05.376205 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-884c8b8f5-jkq4x"] Mar 20 13:45:06 crc kubenswrapper[4895]: I0320 13:45:06.353802 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerStarted","Data":"64dfb20695cc21cb5780a961102900a3efbf2872b128113c0c165809758d6d46"} Mar 20 13:45:06 crc kubenswrapper[4895]: I0320 13:45:06.354247 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:45:06 crc kubenswrapper[4895]: I0320 13:45:06.376561 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.719906688 podStartE2EDuration="7.376538841s" podCreationTimestamp="2026-03-20 13:44:59 +0000 UTC" firstStartedPulling="2026-03-20 13:45:00.245635816 +0000 UTC m=+1399.755354782" lastFinishedPulling="2026-03-20 13:45:05.902267969 +0000 UTC m=+1405.411986935" 
observedRunningTime="2026-03-20 13:45:06.371795065 +0000 UTC m=+1405.881514031" watchObservedRunningTime="2026-03-20 13:45:06.376538841 +0000 UTC m=+1405.886257807" Mar 20 13:45:07 crc kubenswrapper[4895]: I0320 13:45:07.227173 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" path="/var/lib/kubelet/pods/fbf08ec8-de92-4326-9567-c6fe64dfa07e/volumes" Mar 20 13:45:09 crc kubenswrapper[4895]: I0320 13:45:09.216943 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-226rz" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" probeResult="failure" output=< Mar 20 13:45:09 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:45:09 crc kubenswrapper[4895]: > Mar 20 13:45:09 crc kubenswrapper[4895]: I0320 13:45:09.388906 4895 generic.go:334] "Generic (PLEG): container finished" podID="293e033c-47da-4d3e-af29-088700965fc1" containerID="bbdd1d1f96c305553c0cdb619b80965b4410e25181f8fb5a9566ba1154feba29" exitCode=0 Mar 20 13:45:09 crc kubenswrapper[4895]: I0320 13:45:09.388948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vxtht" event={"ID":"293e033c-47da-4d3e-af29-088700965fc1","Type":"ContainerDied","Data":"bbdd1d1f96c305553c0cdb619b80965b4410e25181f8fb5a9566ba1154feba29"} Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.255761 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.341771 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle\") pod \"293e033c-47da-4d3e-af29-088700965fc1\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.341976 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts\") pod \"293e033c-47da-4d3e-af29-088700965fc1\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.342083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq9fx\" (UniqueName: \"kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx\") pod \"293e033c-47da-4d3e-af29-088700965fc1\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.342111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data\") pod \"293e033c-47da-4d3e-af29-088700965fc1\" (UID: \"293e033c-47da-4d3e-af29-088700965fc1\") " Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.347501 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx" (OuterVolumeSpecName: "kube-api-access-cq9fx") pod "293e033c-47da-4d3e-af29-088700965fc1" (UID: "293e033c-47da-4d3e-af29-088700965fc1"). InnerVolumeSpecName "kube-api-access-cq9fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.356228 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts" (OuterVolumeSpecName: "scripts") pod "293e033c-47da-4d3e-af29-088700965fc1" (UID: "293e033c-47da-4d3e-af29-088700965fc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.382717 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "293e033c-47da-4d3e-af29-088700965fc1" (UID: "293e033c-47da-4d3e-af29-088700965fc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.384579 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data" (OuterVolumeSpecName: "config-data") pod "293e033c-47da-4d3e-af29-088700965fc1" (UID: "293e033c-47da-4d3e-af29-088700965fc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.423690 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vxtht" event={"ID":"293e033c-47da-4d3e-af29-088700965fc1","Type":"ContainerDied","Data":"269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c"} Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.423763 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269ca1a2f5f4872f146d4ce6b92049545e52ad0865454600b4766cc377bd034c" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.423856 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vxtht" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.444355 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.444409 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq9fx\" (UniqueName: \"kubernetes.io/projected/293e033c-47da-4d3e-af29-088700965fc1-kube-api-access-cq9fx\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.444424 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.444436 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293e033c-47da-4d3e-af29-088700965fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.590180 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.590638 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerName="nova-scheduler-scheduler" containerID="cri-o://a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" gracePeriod=30 Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.600852 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.601079 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-log" containerID="cri-o://156925f6332f21b83e5cfc81c240c0408cb90e749cd35ee11dc37cdf0db758d0" gracePeriod=30 Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.601149 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-api" containerID="cri-o://3d84c425c255cea632964f64cd233bdd82ee514ab4a6b96de1f2bdeec35db2d0" gracePeriod=30 Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.673114 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.673409 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-log" containerID="cri-o://09879d3e625a0766e94659b2099c44ac187b1969ae5b9414269da3bb7c9b8950" gracePeriod=30 Mar 20 13:45:11 crc kubenswrapper[4895]: I0320 13:45:11.673503 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-metadata" containerID="cri-o://af1352100d2092d7137fe012465a11f10fd84f26bba6920925bdadc92648b206" gracePeriod=30 Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.434518 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerID="3d84c425c255cea632964f64cd233bdd82ee514ab4a6b96de1f2bdeec35db2d0" exitCode=0 Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.434547 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerID="156925f6332f21b83e5cfc81c240c0408cb90e749cd35ee11dc37cdf0db758d0" exitCode=143 Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.434608 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerDied","Data":"3d84c425c255cea632964f64cd233bdd82ee514ab4a6b96de1f2bdeec35db2d0"} Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.434660 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerDied","Data":"156925f6332f21b83e5cfc81c240c0408cb90e749cd35ee11dc37cdf0db758d0"} Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.436037 4895 generic.go:334] "Generic (PLEG): container finished" podID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerID="09879d3e625a0766e94659b2099c44ac187b1969ae5b9414269da3bb7c9b8950" exitCode=143 Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.436066 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerDied","Data":"09879d3e625a0766e94659b2099c44ac187b1969ae5b9414269da3bb7c9b8950"} Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.573605 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.667883 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.668686 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qffmh\" (UniqueName: \"kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.668727 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.668770 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.668938 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.668977 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data\") pod \"ef7eabd3-9963-4df4-a388-e01a242b90a6\" (UID: \"ef7eabd3-9963-4df4-a388-e01a242b90a6\") " Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.669288 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs" (OuterVolumeSpecName: "logs") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.669743 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7eabd3-9963-4df4-a388-e01a242b90a6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.676140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh" (OuterVolumeSpecName: "kube-api-access-qffmh") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "kube-api-access-qffmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.704488 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.710913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data" (OuterVolumeSpecName: "config-data") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.736842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.737207 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef7eabd3-9963-4df4-a388-e01a242b90a6" (UID: "ef7eabd3-9963-4df4-a388-e01a242b90a6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.771513 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.771554 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qffmh\" (UniqueName: \"kubernetes.io/projected/ef7eabd3-9963-4df4-a388-e01a242b90a6-kube-api-access-qffmh\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.771570 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.771585 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4895]: I0320 13:45:12.771597 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7eabd3-9963-4df4-a388-e01a242b90a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.449486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7eabd3-9963-4df4-a388-e01a242b90a6","Type":"ContainerDied","Data":"12b928a3c89c2f2432f62a4c122bd0ea280855070ee140bdb019cadc06f9a5ae"} Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.449825 4895 scope.go:117] "RemoveContainer" containerID="3d84c425c255cea632964f64cd233bdd82ee514ab4a6b96de1f2bdeec35db2d0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.449549 4895 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.489854 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.494164 4895 scope.go:117] "RemoveContainer" containerID="156925f6332f21b83e5cfc81c240c0408cb90e749cd35ee11dc37cdf0db758d0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.516173 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.531768 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533797 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-api" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533822 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-api" Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533831 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-log" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533837 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-log" Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533856 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="dnsmasq-dns" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533862 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="dnsmasq-dns" Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533872 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="init" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533878 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="init" Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533904 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293e033c-47da-4d3e-af29-088700965fc1" containerName="nova-manage" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533911 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="293e033c-47da-4d3e-af29-088700965fc1" containerName="nova-manage" Mar 20 13:45:13 crc kubenswrapper[4895]: E0320 13:45:13.533921 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6273873f-7a10-4969-a18b-c041d9500d8b" containerName="collect-profiles" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.533927 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6273873f-7a10-4969-a18b-c041d9500d8b" containerName="collect-profiles" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.534111 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6273873f-7a10-4969-a18b-c041d9500d8b" containerName="collect-profiles" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.534124 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-log" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.534135 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="293e033c-47da-4d3e-af29-088700965fc1" containerName="nova-manage" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.534151 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf08ec8-de92-4326-9567-c6fe64dfa07e" containerName="dnsmasq-dns" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.534162 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" containerName="nova-api-api" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.543895 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.546587 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.546741 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.546863 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.549043 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589116 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55tv9\" (UniqueName: \"kubernetes.io/projected/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-kube-api-access-55tv9\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589170 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-logs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" 
Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589271 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589293 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-config-data\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.589445 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691366 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55tv9\" (UniqueName: \"kubernetes.io/projected/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-kube-api-access-55tv9\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691464 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691488 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-logs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691532 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691546 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-config-data\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.691711 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.692309 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-logs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.697041 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.697656 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-config-data\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.707036 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.715566 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55tv9\" (UniqueName: \"kubernetes.io/projected/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-kube-api-access-55tv9\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.715698 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6013cf3c-ce92-4c95-b649-5a7d05f4e1fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd\") " pod="openstack/nova-api-0" Mar 20 13:45:13 crc kubenswrapper[4895]: I0320 13:45:13.870563 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:45:14 crc kubenswrapper[4895]: E0320 13:45:14.130253 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7 is running failed: container process not found" containerID="a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:45:14 crc kubenswrapper[4895]: E0320 13:45:14.133544 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7 is running failed: container process not found" containerID="a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:45:14 crc kubenswrapper[4895]: E0320 13:45:14.134513 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7 is running failed: container process not found" containerID="a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:45:14 crc kubenswrapper[4895]: E0320 13:45:14.134586 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerName="nova-scheduler-scheduler" Mar 20 13:45:14 crc kubenswrapper[4895]: I0320 13:45:14.480098 4895 generic.go:334] "Generic (PLEG): container 
finished" podID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerID="a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" exitCode=0 Mar 20 13:45:14 crc kubenswrapper[4895]: I0320 13:45:14.480214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3854bcd-8336-4aac-94e3-1b48dbef874e","Type":"ContainerDied","Data":"a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7"} Mar 20 13:45:14 crc kubenswrapper[4895]: I0320 13:45:14.642825 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:45:14 crc kubenswrapper[4895]: I0320 13:45:14.960188 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.027457 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle\") pod \"e3854bcd-8336-4aac-94e3-1b48dbef874e\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.027532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data\") pod \"e3854bcd-8336-4aac-94e3-1b48dbef874e\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.027646 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4hg\" (UniqueName: \"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg\") pod \"e3854bcd-8336-4aac-94e3-1b48dbef874e\" (UID: \"e3854bcd-8336-4aac-94e3-1b48dbef874e\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.032864 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg" (OuterVolumeSpecName: "kube-api-access-rk4hg") pod "e3854bcd-8336-4aac-94e3-1b48dbef874e" (UID: "e3854bcd-8336-4aac-94e3-1b48dbef874e"). InnerVolumeSpecName "kube-api-access-rk4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.068946 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data" (OuterVolumeSpecName: "config-data") pod "e3854bcd-8336-4aac-94e3-1b48dbef874e" (UID: "e3854bcd-8336-4aac-94e3-1b48dbef874e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.092870 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3854bcd-8336-4aac-94e3-1b48dbef874e" (UID: "e3854bcd-8336-4aac-94e3-1b48dbef874e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.150350 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.150383 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3854bcd-8336-4aac-94e3-1b48dbef874e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.150407 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4hg\" (UniqueName: \"kubernetes.io/projected/e3854bcd-8336-4aac-94e3-1b48dbef874e-kube-api-access-rk4hg\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.242674 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7eabd3-9963-4df4-a388-e01a242b90a6" path="/var/lib/kubelet/pods/ef7eabd3-9963-4df4-a388-e01a242b90a6/volumes" Mar 20 13:45:15 crc kubenswrapper[4895]: E0320 13:45:15.400145 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3854bcd_8336_4aac_94e3_1b48dbef874e.slice/crio-9201f3752ca4224daea43a5e454c5c9e4cc3e1d8b841989a6208b376bccf2f6c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3854bcd_8336_4aac_94e3_1b48dbef874e.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.525146 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.525226 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3854bcd-8336-4aac-94e3-1b48dbef874e","Type":"ContainerDied","Data":"9201f3752ca4224daea43a5e454c5c9e4cc3e1d8b841989a6208b376bccf2f6c"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.525669 4895 scope.go:117] "RemoveContainer" containerID="a705ec6084c0d8bab2a61bf9020a755ac3aeb8c5e9ea1ec47b1f3327bed694c7" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.529534 4895 generic.go:334] "Generic (PLEG): container finished" podID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerID="af1352100d2092d7137fe012465a11f10fd84f26bba6920925bdadc92648b206" exitCode=0 Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.529612 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerDied","Data":"af1352100d2092d7137fe012465a11f10fd84f26bba6920925bdadc92648b206"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.529656 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b0fde814-3ccb-4c95-915d-fa586ca8a578","Type":"ContainerDied","Data":"e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.529667 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07a923840c1268340637cac746021ac9929932924e2cae7165b828995ff15ee" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.531739 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd","Type":"ContainerStarted","Data":"c27fec20e11afa6ff2ee56fa7cf4cfb3a22c687cb1d210993b12d2e1671007b9"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.531767 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd","Type":"ContainerStarted","Data":"f30ff762e6fbd4d8fead1e675bdc746f686471114f7feb26d762ac017f4e01d3"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.531777 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6013cf3c-ce92-4c95-b649-5a7d05f4e1fd","Type":"ContainerStarted","Data":"1d9986138d1ac841745c756e724ba97f375ffc4b41891968335df0cedc24a2f2"} Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.553136 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.553118986 podStartE2EDuration="2.553118986s" podCreationTimestamp="2026-03-20 13:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:15.552287856 +0000 UTC m=+1415.062006822" watchObservedRunningTime="2026-03-20 13:45:15.553118986 +0000 UTC m=+1415.062837952" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.581528 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.610357 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.635188 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.650647 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4895]: E0320 13:45:15.651120 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-log" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.651131 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-log" Mar 20 13:45:15 crc kubenswrapper[4895]: E0320 13:45:15.651142 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerName="nova-scheduler-scheduler" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.651148 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerName="nova-scheduler-scheduler" Mar 20 13:45:15 crc kubenswrapper[4895]: E0320 13:45:15.651163 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-metadata" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.651169 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-metadata" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.651348 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-log" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 
13:45:15.651359 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" containerName="nova-metadata-metadata" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.651376 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" containerName="nova-scheduler-scheduler" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.652134 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.656167 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.662232 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.663511 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle\") pod \"b0fde814-3ccb-4c95-915d-fa586ca8a578\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.663618 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data\") pod \"b0fde814-3ccb-4c95-915d-fa586ca8a578\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.663669 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs\") pod \"b0fde814-3ccb-4c95-915d-fa586ca8a578\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.663807 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs\") pod \"b0fde814-3ccb-4c95-915d-fa586ca8a578\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.663830 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdpj\" (UniqueName: \"kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj\") pod \"b0fde814-3ccb-4c95-915d-fa586ca8a578\" (UID: \"b0fde814-3ccb-4c95-915d-fa586ca8a578\") " Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.667570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs" (OuterVolumeSpecName: "logs") pod "b0fde814-3ccb-4c95-915d-fa586ca8a578" (UID: "b0fde814-3ccb-4c95-915d-fa586ca8a578"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.684381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj" (OuterVolumeSpecName: "kube-api-access-2jdpj") pod "b0fde814-3ccb-4c95-915d-fa586ca8a578" (UID: "b0fde814-3ccb-4c95-915d-fa586ca8a578"). InnerVolumeSpecName "kube-api-access-2jdpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.695242 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data" (OuterVolumeSpecName: "config-data") pod "b0fde814-3ccb-4c95-915d-fa586ca8a578" (UID: "b0fde814-3ccb-4c95-915d-fa586ca8a578"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.736047 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b0fde814-3ccb-4c95-915d-fa586ca8a578" (UID: "b0fde814-3ccb-4c95-915d-fa586ca8a578"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.741444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0fde814-3ccb-4c95-915d-fa586ca8a578" (UID: "b0fde814-3ccb-4c95-915d-fa586ca8a578"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.768751 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.768867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-config-data\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.768949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wwk\" (UniqueName: 
\"kubernetes.io/projected/d195aa63-ddd2-44d4-b7ec-fc6761422619-kube-api-access-m6wwk\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.769021 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.769031 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fde814-3ccb-4c95-915d-fa586ca8a578-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.769042 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.769051 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdpj\" (UniqueName: \"kubernetes.io/projected/b0fde814-3ccb-4c95-915d-fa586ca8a578-kube-api-access-2jdpj\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.769059 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fde814-3ccb-4c95-915d-fa586ca8a578-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.870301 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-config-data\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.870468 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m6wwk\" (UniqueName: \"kubernetes.io/projected/d195aa63-ddd2-44d4-b7ec-fc6761422619-kube-api-access-m6wwk\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.870551 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.874366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.874865 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d195aa63-ddd2-44d4-b7ec-fc6761422619-config-data\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4895]: I0320 13:45:15.886701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wwk\" (UniqueName: \"kubernetes.io/projected/d195aa63-ddd2-44d4-b7ec-fc6761422619-kube-api-access-m6wwk\") pod \"nova-scheduler-0\" (UID: \"d195aa63-ddd2-44d4-b7ec-fc6761422619\") " pod="openstack/nova-scheduler-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.080669 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.541815 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.562128 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.727253 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.740246 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.758511 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.760129 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.763699 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.763918 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.774981 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.892000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.892078 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-config-data\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.892098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/155eaf40-0b01-4dba-af34-0fce0b907680-logs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.892179 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.892295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdd8\" (UniqueName: \"kubernetes.io/projected/155eaf40-0b01-4dba-af34-0fce0b907680-kube-api-access-xwdd8\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994049 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994123 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-config-data\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994144 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/155eaf40-0b01-4dba-af34-0fce0b907680-logs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994222 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdd8\" (UniqueName: \"kubernetes.io/projected/155eaf40-0b01-4dba-af34-0fce0b907680-kube-api-access-xwdd8\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.994772 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/155eaf40-0b01-4dba-af34-0fce0b907680-logs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc kubenswrapper[4895]: I0320 13:45:16.998713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-config-data\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:16 crc 
kubenswrapper[4895]: I0320 13:45:16.998824 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.000905 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/155eaf40-0b01-4dba-af34-0fce0b907680-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.012370 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdd8\" (UniqueName: \"kubernetes.io/projected/155eaf40-0b01-4dba-af34-0fce0b907680-kube-api-access-xwdd8\") pod \"nova-metadata-0\" (UID: \"155eaf40-0b01-4dba-af34-0fce0b907680\") " pod="openstack/nova-metadata-0" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.078893 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.226368 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fde814-3ccb-4c95-915d-fa586ca8a578" path="/var/lib/kubelet/pods/b0fde814-3ccb-4c95-915d-fa586ca8a578/volumes" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.227000 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3854bcd-8336-4aac-94e3-1b48dbef874e" path="/var/lib/kubelet/pods/e3854bcd-8336-4aac-94e3-1b48dbef874e/volumes" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.555919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d195aa63-ddd2-44d4-b7ec-fc6761422619","Type":"ContainerStarted","Data":"8588ce1e8aa8e03ca06a369b67e20a1384d325d897b35d574945d39b68b46528"} Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.555968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d195aa63-ddd2-44d4-b7ec-fc6761422619","Type":"ContainerStarted","Data":"437bbc95bc9e5b5c56baff99a8a683cc9e4b6bb8f6ccc1b58fa57bf4aa1a0c59"} Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.585523 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.585504892 podStartE2EDuration="2.585504892s" podCreationTimestamp="2026-03-20 13:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:17.573112016 +0000 UTC m=+1417.082830982" watchObservedRunningTime="2026-03-20 13:45:17.585504892 +0000 UTC m=+1417.095223858" Mar 20 13:45:17 crc kubenswrapper[4895]: I0320 13:45:17.627663 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:45:17 crc kubenswrapper[4895]: W0320 13:45:17.629585 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod155eaf40_0b01_4dba_af34_0fce0b907680.slice/crio-84c426d71f6875f92cab0beb3c09490f09531442daab8d78ff403783d5e6dd79 WatchSource:0}: Error finding container 84c426d71f6875f92cab0beb3c09490f09531442daab8d78ff403783d5e6dd79: Status 404 returned error can't find the container with id 84c426d71f6875f92cab0beb3c09490f09531442daab8d78ff403783d5e6dd79 Mar 20 13:45:18 crc kubenswrapper[4895]: I0320 13:45:18.569605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"155eaf40-0b01-4dba-af34-0fce0b907680","Type":"ContainerStarted","Data":"ef50a7199c4ea390c1533e2e7591755c00e53dd096c2222b8c80e3315a612f1f"} Mar 20 13:45:18 crc kubenswrapper[4895]: I0320 13:45:18.571138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"155eaf40-0b01-4dba-af34-0fce0b907680","Type":"ContainerStarted","Data":"6934440579b53ee57d307a1c1d0f0c0a985b7979dc52bfd5473aff913e3df94e"} Mar 20 13:45:18 crc kubenswrapper[4895]: I0320 13:45:18.571225 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"155eaf40-0b01-4dba-af34-0fce0b907680","Type":"ContainerStarted","Data":"84c426d71f6875f92cab0beb3c09490f09531442daab8d78ff403783d5e6dd79"} Mar 20 13:45:18 crc kubenswrapper[4895]: I0320 13:45:18.592129 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.592109568 podStartE2EDuration="2.592109568s" podCreationTimestamp="2026-03-20 13:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:18.586969071 +0000 UTC m=+1418.096688037" watchObservedRunningTime="2026-03-20 13:45:18.592109568 +0000 UTC m=+1418.101828534" Mar 20 13:45:19 crc kubenswrapper[4895]: I0320 13:45:19.203929 4895 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-226rz" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" probeResult="failure" output=< Mar 20 13:45:19 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:45:19 crc kubenswrapper[4895]: > Mar 20 13:45:21 crc kubenswrapper[4895]: I0320 13:45:21.081477 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.301758 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.301827 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.301875 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.302621 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.302670 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf" gracePeriod=600 Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.650951 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf" exitCode=0 Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.651032 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf"} Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.651281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732"} Mar 20 13:45:22 crc kubenswrapper[4895]: I0320 13:45:22.651304 4895 scope.go:117] "RemoveContainer" containerID="cb408bd659a280d8aef8f72a90961fbfc134ddc96013e0c852f8ece7da9a11f5" Mar 20 13:45:23 crc kubenswrapper[4895]: I0320 13:45:23.870795 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:45:23 crc kubenswrapper[4895]: I0320 13:45:23.871151 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:45:24 crc kubenswrapper[4895]: I0320 13:45:24.885657 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6013cf3c-ce92-4c95-b649-5a7d05f4e1fd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:45:24 crc kubenswrapper[4895]: I0320 13:45:24.885681 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6013cf3c-ce92-4c95-b649-5a7d05f4e1fd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:45:26 crc kubenswrapper[4895]: I0320 13:45:26.111011 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:45:26 crc kubenswrapper[4895]: I0320 13:45:26.153760 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:45:26 crc kubenswrapper[4895]: I0320 13:45:26.718854 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:45:27 crc kubenswrapper[4895]: I0320 13:45:27.079791 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:45:27 crc kubenswrapper[4895]: I0320 13:45:27.079871 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:45:28 crc kubenswrapper[4895]: I0320 13:45:28.091604 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="155eaf40-0b01-4dba-af34-0fce0b907680" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.241:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:45:28 crc kubenswrapper[4895]: I0320 13:45:28.091614 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="155eaf40-0b01-4dba-af34-0fce0b907680" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.241:8775/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Mar 20 13:45:28 crc kubenswrapper[4895]: I0320 13:45:28.197565 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:45:28 crc kubenswrapper[4895]: I0320 13:45:28.240778 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:45:29 crc kubenswrapper[4895]: I0320 13:45:29.008286 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:45:29 crc kubenswrapper[4895]: I0320 13:45:29.613057 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:45:29 crc kubenswrapper[4895]: I0320 13:45:29.753695 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-226rz" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" containerID="cri-o://81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d" gracePeriod=2 Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.620873 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.740725 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content\") pod \"a0efc230-d08e-425c-87cb-96b47fa4474a\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.740808 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities\") pod \"a0efc230-d08e-425c-87cb-96b47fa4474a\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.740842 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2mx\" (UniqueName: \"kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx\") pod \"a0efc230-d08e-425c-87cb-96b47fa4474a\" (UID: \"a0efc230-d08e-425c-87cb-96b47fa4474a\") " Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.741287 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities" (OuterVolumeSpecName: "utilities") pod "a0efc230-d08e-425c-87cb-96b47fa4474a" (UID: "a0efc230-d08e-425c-87cb-96b47fa4474a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.741525 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.754685 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx" (OuterVolumeSpecName: "kube-api-access-ph2mx") pod "a0efc230-d08e-425c-87cb-96b47fa4474a" (UID: "a0efc230-d08e-425c-87cb-96b47fa4474a"). InnerVolumeSpecName "kube-api-access-ph2mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.763986 4895 generic.go:334] "Generic (PLEG): container finished" podID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerID="81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d" exitCode=0 Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.764036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerDied","Data":"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d"} Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.764050 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-226rz" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.764077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-226rz" event={"ID":"a0efc230-d08e-425c-87cb-96b47fa4474a","Type":"ContainerDied","Data":"eefdf3977d6f81197a3cc478dcf7eba91ab43712088a4eaf55a71096d05f0acc"} Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.764101 4895 scope.go:117] "RemoveContainer" containerID="81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.807031 4895 scope.go:117] "RemoveContainer" containerID="21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.843656 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2mx\" (UniqueName: \"kubernetes.io/projected/a0efc230-d08e-425c-87cb-96b47fa4474a-kube-api-access-ph2mx\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.843683 4895 scope.go:117] "RemoveContainer" containerID="009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.873255 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0efc230-d08e-425c-87cb-96b47fa4474a" (UID: "a0efc230-d08e-425c-87cb-96b47fa4474a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.880166 4895 scope.go:117] "RemoveContainer" containerID="81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d" Mar 20 13:45:30 crc kubenswrapper[4895]: E0320 13:45:30.880715 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d\": container with ID starting with 81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d not found: ID does not exist" containerID="81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.880756 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d"} err="failed to get container status \"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d\": rpc error: code = NotFound desc = could not find container \"81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d\": container with ID starting with 81ea13026473db3cb2278b67711411ff8399501e9af69a5adf5fc2f172d2360d not found: ID does not exist" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.880782 4895 scope.go:117] "RemoveContainer" containerID="21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086" Mar 20 13:45:30 crc kubenswrapper[4895]: E0320 13:45:30.881273 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086\": container with ID starting with 21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086 not found: ID does not exist" containerID="21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.881300 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086"} err="failed to get container status \"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086\": rpc error: code = NotFound desc = could not find container \"21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086\": container with ID starting with 21dec5bdaca1822722c9bac60a3dec84bff454c8e277fde51fcddc7a2a091086 not found: ID does not exist" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.881314 4895 scope.go:117] "RemoveContainer" containerID="009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c" Mar 20 13:45:30 crc kubenswrapper[4895]: E0320 13:45:30.882276 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c\": container with ID starting with 009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c not found: ID does not exist" containerID="009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.882300 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c"} err="failed to get container status \"009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c\": rpc error: code = NotFound desc = could not find container \"009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c\": container with ID starting with 009224d109da5a7c385abe6a32c72312aa1c84a0e05c84df7153fc15a2ef9b6c not found: ID does not exist" Mar 20 13:45:30 crc kubenswrapper[4895]: I0320 13:45:30.945388 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a0efc230-d08e-425c-87cb-96b47fa4474a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:31 crc kubenswrapper[4895]: I0320 13:45:31.102674 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:45:31 crc kubenswrapper[4895]: I0320 13:45:31.116294 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-226rz"] Mar 20 13:45:31 crc kubenswrapper[4895]: I0320 13:45:31.227977 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" path="/var/lib/kubelet/pods/a0efc230-d08e-425c-87cb-96b47fa4474a/volumes" Mar 20 13:45:31 crc kubenswrapper[4895]: I0320 13:45:31.870870 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:45:31 crc kubenswrapper[4895]: I0320 13:45:31.871225 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:45:33 crc kubenswrapper[4895]: I0320 13:45:33.879664 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:45:33 crc kubenswrapper[4895]: I0320 13:45:33.883719 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:45:33 crc kubenswrapper[4895]: I0320 13:45:33.886466 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:45:34 crc kubenswrapper[4895]: I0320 13:45:34.804738 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:45:35 crc kubenswrapper[4895]: I0320 13:45:35.079438 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:45:35 crc kubenswrapper[4895]: I0320 13:45:35.080204 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 20 13:45:37 crc kubenswrapper[4895]: I0320 13:45:37.088623 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:45:37 crc kubenswrapper[4895]: I0320 13:45:37.091872 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:45:37 crc kubenswrapper[4895]: I0320 13:45:37.093778 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:45:37 crc kubenswrapper[4895]: I0320 13:45:37.832605 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.590447 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-lh78p"] Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.599563 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-lh78p"] Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.701436 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-d69kx"] Mar 20 13:45:46 crc kubenswrapper[4895]: E0320 13:45:46.701960 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.701987 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" Mar 20 13:45:46 crc kubenswrapper[4895]: E0320 13:45:46.702015 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="extract-content" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.702025 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="extract-content" Mar 20 13:45:46 crc 
kubenswrapper[4895]: E0320 13:45:46.705564 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="extract-utilities" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.705582 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="extract-utilities" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.706789 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0efc230-d08e-425c-87cb-96b47fa4474a" containerName="registry-server" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.708187 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.710600 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.721863 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-d69kx"] Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.863381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.863467 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.863493 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.863763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.863868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4bql\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.965903 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.965966 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.965991 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.966100 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.966143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4bql\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.972478 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.973098 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.973443 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data\") pod \"cloudkitty-db-sync-d69kx\" (UID: 
\"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.974231 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:46 crc kubenswrapper[4895]: I0320 13:45:46.984314 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4bql\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql\") pod \"cloudkitty-db-sync-d69kx\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:47 crc kubenswrapper[4895]: I0320 13:45:47.025225 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:45:47 crc kubenswrapper[4895]: I0320 13:45:47.244688 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c" path="/var/lib/kubelet/pods/8ac3ac6a-ec84-475e-9600-a8eb5e7c0c7c/volumes" Mar 20 13:45:47 crc kubenswrapper[4895]: I0320 13:45:47.660930 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-d69kx"] Mar 20 13:45:47 crc kubenswrapper[4895]: I0320 13:45:47.665420 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:45:47 crc kubenswrapper[4895]: I0320 13:45:47.948417 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d69kx" event={"ID":"bb5ee4b2-1013-4687-b3aa-df5362f4b435","Type":"ContainerStarted","Data":"cd463d9d0e0758419ba03477b06d13d2f1f4d8ebc855d9541934a6bc0328fe2c"} Mar 20 13:45:48 crc kubenswrapper[4895]: I0320 13:45:48.978060 4895 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.052596 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.160170 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.160689 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-central-agent" containerID="cri-o://f46d03abb83b1eef1e620dbf8ff72082789c2d6d1b23a10ec39186adabeb853f" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.160864 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="sg-core" containerID="cri-o://20cdb5c5d86687cb4d34ea95d0eaa6a21477df07f9810fa048df7e13652f4f04" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.160916 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="proxy-httpd" containerID="cri-o://64dfb20695cc21cb5780a961102900a3efbf2872b128113c0c165809758d6d46" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.160934 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-notification-agent" containerID="cri-o://9463ca9a976be5cf6d28f485c0eb78ff286439c3b1f2bd8cdc8e765224202d78" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983448 4895 generic.go:334] "Generic (PLEG): container finished" podID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" 
containerID="64dfb20695cc21cb5780a961102900a3efbf2872b128113c0c165809758d6d46" exitCode=0 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983784 4895 generic.go:334] "Generic (PLEG): container finished" podID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerID="20cdb5c5d86687cb4d34ea95d0eaa6a21477df07f9810fa048df7e13652f4f04" exitCode=2 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983799 4895 generic.go:334] "Generic (PLEG): container finished" podID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerID="f46d03abb83b1eef1e620dbf8ff72082789c2d6d1b23a10ec39186adabeb853f" exitCode=0 Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983824 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerDied","Data":"64dfb20695cc21cb5780a961102900a3efbf2872b128113c0c165809758d6d46"} Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983855 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerDied","Data":"20cdb5c5d86687cb4d34ea95d0eaa6a21477df07f9810fa048df7e13652f4f04"} Mar 20 13:45:49 crc kubenswrapper[4895]: I0320 13:45:49.983869 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerDied","Data":"f46d03abb83b1eef1e620dbf8ff72082789c2d6d1b23a10ec39186adabeb853f"} Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.015375 4895 generic.go:334] "Generic (PLEG): container finished" podID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerID="9463ca9a976be5cf6d28f485c0eb78ff286439c3b1f2bd8cdc8e765224202d78" exitCode=0 Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.015720 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerDied","Data":"9463ca9a976be5cf6d28f485c0eb78ff286439c3b1f2bd8cdc8e765224202d78"} Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.818500 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860148 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860194 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd\") pod 
\"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860339 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860361 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll5gv\" (UniqueName: \"kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv\") pod \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\" (UID: \"1996ba33-da57-45d1-bb4d-eef80d7cb60c\") " Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860513 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.860703 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.861123 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.861144 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1996ba33-da57-45d1-bb4d-eef80d7cb60c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.889828 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv" (OuterVolumeSpecName: "kube-api-access-ll5gv") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "kube-api-access-ll5gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.891174 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts" (OuterVolumeSpecName: "scripts") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.947437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.964662 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.965867 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.965940 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll5gv\" (UniqueName: \"kubernetes.io/projected/1996ba33-da57-45d1-bb4d-eef80d7cb60c-kube-api-access-ll5gv\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.965995 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.966055 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4895]: I0320 13:45:51.977240 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.027707 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data" (OuterVolumeSpecName: "config-data") pod "1996ba33-da57-45d1-bb4d-eef80d7cb60c" (UID: "1996ba33-da57-45d1-bb4d-eef80d7cb60c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.033082 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1996ba33-da57-45d1-bb4d-eef80d7cb60c","Type":"ContainerDied","Data":"3cac05fdf70fb2190639ba84f4d1587bc3945e92c6f2a427302726972dc9f030"} Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.033130 4895 scope.go:117] "RemoveContainer" containerID="64dfb20695cc21cb5780a961102900a3efbf2872b128113c0c165809758d6d46" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.033270 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.067939 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.067971 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1996ba33-da57-45d1-bb4d-eef80d7cb60c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.142903 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.166296 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.170242 4895 scope.go:117] "RemoveContainer" containerID="20cdb5c5d86687cb4d34ea95d0eaa6a21477df07f9810fa048df7e13652f4f04" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.183640 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:52 crc kubenswrapper[4895]: E0320 13:45:52.184060 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-central-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184073 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-central-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: E0320 13:45:52.184091 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="sg-core" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184097 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="sg-core" 
Mar 20 13:45:52 crc kubenswrapper[4895]: E0320 13:45:52.184118 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-notification-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184126 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-notification-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: E0320 13:45:52.184143 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="proxy-httpd" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184149 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="proxy-httpd" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184462 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-notification-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184495 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="proxy-httpd" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184508 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="ceilometer-central-agent" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.184516 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" containerName="sg-core" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.187139 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.191788 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.191799 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.191793 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.194032 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.200685 4895 scope.go:117] "RemoveContainer" containerID="9463ca9a976be5cf6d28f485c0eb78ff286439c3b1f2bd8cdc8e765224202d78" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.248156 4895 scope.go:117] "RemoveContainer" containerID="f46d03abb83b1eef1e620dbf8ff72082789c2d6d1b23a10ec39186adabeb853f" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.271707 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.271773 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-scripts\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.271819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbnhs\" 
(UniqueName: \"kubernetes.io/projected/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-kube-api-access-pbnhs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.271858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.272702 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.272738 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-config-data\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.272946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.273259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-run-httpd\") pod \"ceilometer-0\" (UID: 
\"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.375627 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.375673 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-config-data\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.375724 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.375802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-run-httpd\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.375958 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.376350 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-run-httpd\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.376401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-scripts\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.376437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbnhs\" (UniqueName: \"kubernetes.io/projected/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-kube-api-access-pbnhs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.376465 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.376740 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-log-httpd\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.381208 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: 
I0320 13:45:52.381377 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.381523 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.382666 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-scripts\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.398973 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-config-data\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.403141 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbnhs\" (UniqueName: \"kubernetes.io/projected/1952c8e8-d8db-4bf4-81b5-57be48de5cbc-kube-api-access-pbnhs\") pod \"ceilometer-0\" (UID: \"1952c8e8-d8db-4bf4-81b5-57be48de5cbc\") " pod="openstack/ceilometer-0" Mar 20 13:45:52 crc kubenswrapper[4895]: I0320 13:45:52.537408 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:53 crc kubenswrapper[4895]: I0320 13:45:53.034643 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:53 crc kubenswrapper[4895]: W0320 13:45:53.054264 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1952c8e8_d8db_4bf4_81b5_57be48de5cbc.slice/crio-7990ec51338ffeeb26b3743a66ae6f946ebff7edb195f799a55e7b9015f4809d WatchSource:0}: Error finding container 7990ec51338ffeeb26b3743a66ae6f946ebff7edb195f799a55e7b9015f4809d: Status 404 returned error can't find the container with id 7990ec51338ffeeb26b3743a66ae6f946ebff7edb195f799a55e7b9015f4809d Mar 20 13:45:53 crc kubenswrapper[4895]: I0320 13:45:53.223508 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1996ba33-da57-45d1-bb4d-eef80d7cb60c" path="/var/lib/kubelet/pods/1996ba33-da57-45d1-bb4d-eef80d7cb60c/volumes" Mar 20 13:45:53 crc kubenswrapper[4895]: I0320 13:45:53.747840 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="rabbitmq" containerID="cri-o://7697587e287762ce47515f74218184115e63cfb97792b8724a3bff895729b31a" gracePeriod=604796 Mar 20 13:45:53 crc kubenswrapper[4895]: I0320 13:45:53.775783 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq" containerID="cri-o://edef95c7aadc2de12b902612def468d8cf92db96635227593d1fc4c8cf48f79d" gracePeriod=604796 Mar 20 13:45:54 crc kubenswrapper[4895]: I0320 13:45:54.114461 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1952c8e8-d8db-4bf4-81b5-57be48de5cbc","Type":"ContainerStarted","Data":"7990ec51338ffeeb26b3743a66ae6f946ebff7edb195f799a55e7b9015f4809d"} Mar 
20 13:45:55 crc kubenswrapper[4895]: I0320 13:45:55.192101 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Mar 20 13:45:55 crc kubenswrapper[4895]: I0320 13:45:55.498499 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.141967 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-s5pxf"] Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.143804 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.147780 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.147881 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.147949 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.152299 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-s5pxf"] Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.193979 4895 generic.go:334] "Generic (PLEG): container finished" podID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerID="7697587e287762ce47515f74218184115e63cfb97792b8724a3bff895729b31a" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 
13:46:00.194075 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerDied","Data":"7697587e287762ce47515f74218184115e63cfb97792b8724a3bff895729b31a"} Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.197859 4895 generic.go:334] "Generic (PLEG): container finished" podID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerID="edef95c7aadc2de12b902612def468d8cf92db96635227593d1fc4c8cf48f79d" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.197896 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerDied","Data":"edef95c7aadc2de12b902612def468d8cf92db96635227593d1fc4c8cf48f79d"} Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.271095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4md8m\" (UniqueName: \"kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m\") pod \"auto-csr-approver-29566906-s5pxf\" (UID: \"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f\") " pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.373863 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4md8m\" (UniqueName: \"kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m\") pod \"auto-csr-approver-29566906-s5pxf\" (UID: \"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f\") " pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.402509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4md8m\" (UniqueName: \"kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m\") pod \"auto-csr-approver-29566906-s5pxf\" (UID: 
\"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f\") " pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:00 crc kubenswrapper[4895]: I0320 13:46:00.463884 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.169777 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.171994 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.176127 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.183145 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347114 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347220 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347270 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfxm\" (UniqueName: \"kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347306 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347334 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.347359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449582 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449827 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449852 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfxm\" (UniqueName: 
\"kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.449906 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.450537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.450567 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.450601 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.450855 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb\") pod 
\"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.452434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.452486 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.471842 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfxm\" (UniqueName: \"kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm\") pod \"dnsmasq-dns-dc7c944bf-rkt9d\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:03 crc kubenswrapper[4895]: I0320 13:46:03.489615 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.249676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89838b7e-6fb3-4105-b03f-1f812f9ec514","Type":"ContainerDied","Data":"e7c115aca404b1360a94b3f6f5fee2200a739de32d7248fc9c2801936a52ae8d"} Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.250023 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c115aca404b1360a94b3f6f5fee2200a739de32d7248fc9c2801936a52ae8d" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.259675 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8","Type":"ContainerDied","Data":"658f3576af50f1584711ea10ef11af77140ffd413ea8eb4a8544af05fc420721"} Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.259711 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658f3576af50f1584711ea10ef11af77140ffd413ea8eb4a8544af05fc420721" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.260096 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.266887 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.312835 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313222 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313484 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313506 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313544 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313606 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313641 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.313684 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.316609 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319719 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319807 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319832 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-622j2\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319865 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319890 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.319933 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.320008 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.321548 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.321597 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b88hx\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") " Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.322783 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.324712 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.334311 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info" (OuterVolumeSpecName: "pod-info") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.339029 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.341175 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.341454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.349090 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx" (OuterVolumeSpecName: "kube-api-access-b88hx") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "kube-api-access-b88hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.353619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.371676 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2" (OuterVolumeSpecName: "kube-api-access-622j2") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "kube-api-access-622j2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.373598 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info" (OuterVolumeSpecName: "pod-info") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.380834 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.413358 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data" (OuterVolumeSpecName: "config-data") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.424788 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") "
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.424850 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie\") pod \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\" (UID: \"70d3d6b0-04b6-4b47-bd85-2fa9212b68a8\") "
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.424888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") "
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.424976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791" (OuterVolumeSpecName: "persistence") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: E0320 13:46:05.425227 4895 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/vol_data.json]: open /var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"89838b7e-6fb3-4105-b03f-1f812f9ec514\" (UID: \"89838b7e-6fb3-4105-b03f-1f812f9ec514\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/vol_data.json]: open /var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes/kubernetes.io~csi/pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791/vol_data.json: no such file or directory"
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.427722 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428092 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428135 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") on node \"crc\" "
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428157 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428168 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-622j2\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-kube-api-access-622j2\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428181 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428190 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89838b7e-6fb3-4105-b03f-1f812f9ec514-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428204 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b88hx\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-kube-api-access-b88hx\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428215 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428224 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428232 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89838b7e-6fb3-4105-b03f-1f812f9ec514-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428243 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428252 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.428434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.491634 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.492268 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data" (OuterVolumeSpecName: "config-data") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.501069 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f" (OuterVolumeSpecName: "persistence") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.531426 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.531473 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") on node \"crc\" "
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.531486 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.531495 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.531506 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.576348 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf" (OuterVolumeSpecName: "server-conf") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.604496 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.604658 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791") on node "crc"
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.616357 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf" (OuterVolumeSpecName: "server-conf") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.622960 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.624913 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f") on node "crc"
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.633462 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.633489 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.633499 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89838b7e-6fb3-4105-b03f-1f812f9ec514-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.633507 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.645992 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "89838b7e-6fb3-4105-b03f-1f812f9ec514" (UID: "89838b7e-6fb3-4105-b03f-1f812f9ec514"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.691889 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" (UID: "70d3d6b0-04b6-4b47-bd85-2fa9212b68a8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.735790 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89838b7e-6fb3-4105-b03f-1f812f9ec514-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4895]: I0320 13:46:05.736017 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.268695 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.268725 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.316125 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.324041 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.335218 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.346185 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.365650 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: E0320 13:46:06.366221 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366281 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: E0320 13:46:06.366334 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="setup-container"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366381 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="setup-container"
Mar 20 13:46:06 crc kubenswrapper[4895]: E0320 13:46:06.366482 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="setup-container"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366531 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="setup-container"
Mar 20 13:46:06 crc kubenswrapper[4895]: E0320 13:46:06.366607 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366656 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366891 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.366956 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.368097 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.381868 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.390411 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.393360 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.393841 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.393997 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9hmzl"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394107 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394363 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394492 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394631 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7plls"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394699 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394823 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394916 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.395025 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.395086 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.394662 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.395257 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.395059 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.401163 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549444 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa162ed3-a588-406c-a81e-5aafc5a82d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549745 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549801 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549864 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549906 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549930 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.549961 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550003 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjgqw\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-kube-api-access-sjgqw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550096 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550190 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa162ed3-a588-406c-a81e-5aafc5a82d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550550 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550697 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5589c\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-kube-api-access-5589c\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550719 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.550738 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652596 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjgqw\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-kube-api-access-sjgqw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652676 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652749 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652771 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652793 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa162ed3-a588-406c-a81e-5aafc5a82d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652838 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652925 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5589c\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-kube-api-access-5589c\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652941 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.652986 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653026 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa162ed3-a588-406c-a81e-5aafc5a82d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653062 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653077 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653101 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.653158 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:46:06 crc
kubenswrapper[4895]: I0320 13:46:06.654222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.654460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.654523 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.655046 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.655170 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa162ed3-a588-406c-a81e-5aafc5a82d05-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.655299 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.659497 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa162ed3-a588-406c-a81e-5aafc5a82d05-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.667733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.667983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.668722 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.668872 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.669792 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.670552 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.672862 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.672900 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf58efcf1f086a4b5a46ed60249900a178edf090a90489330013a00504335efb/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.672944 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.672977 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1ed9a52ad5555824dc3a173b342b86aa900d19d5be3872519e1faa195be79d6d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.673531 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.680883 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa162ed3-a588-406c-a81e-5aafc5a82d05-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.681686 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.683948 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.684165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.688222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5589c\" (UniqueName: \"kubernetes.io/projected/6a6f84dd-56f2-4594-a3a0-bd428f57c6be-kube-api-access-5589c\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " pod="openstack/rabbitmq-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.689039 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjgqw\" (UniqueName: \"kubernetes.io/projected/fa162ed3-a588-406c-a81e-5aafc5a82d05-kube-api-access-sjgqw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.737319 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2594cdb3-a64b-4fa9-a1d3-07a206cdb10f\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa162ed3-a588-406c-a81e-5aafc5a82d05\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:06 crc kubenswrapper[4895]: I0320 13:46:06.768935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0f2b10d-9a99-4528-a87a-b0d1fc165791\") pod \"rabbitmq-server-0\" (UID: \"6a6f84dd-56f2-4594-a3a0-bd428f57c6be\") " 
pod="openstack/rabbitmq-server-0" Mar 20 13:46:07 crc kubenswrapper[4895]: I0320 13:46:07.027534 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:07 crc kubenswrapper[4895]: I0320 13:46:07.039223 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:46:07 crc kubenswrapper[4895]: I0320 13:46:07.225052 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" path="/var/lib/kubelet/pods/70d3d6b0-04b6-4b47-bd85-2fa9212b68a8/volumes" Mar 20 13:46:07 crc kubenswrapper[4895]: I0320 13:46:07.227487 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89838b7e-6fb3-4105-b03f-1f812f9ec514" path="/var/lib/kubelet/pods/89838b7e-6fb3-4105-b03f-1f812f9ec514/volumes" Mar 20 13:46:10 crc kubenswrapper[4895]: I0320 13:46:10.193266 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="70d3d6b0-04b6-4b47-bd85-2fa9212b68a8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: i/o timeout" Mar 20 13:46:11 crc kubenswrapper[4895]: E0320 13:46:11.596018 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 20 13:46:11 crc kubenswrapper[4895]: E0320 13:46:11.596279 4895 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 20 13:46:11 crc kubenswrapper[4895]: E0320 13:46:11.596405 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dch66ch594hdbh5c6h89h85h656hdch78hbbh699h5ffhbbhbbhf7hc6h9bh548h8ch5dch96h54fh5bfh65ch596h5fch5fch58bh66h89h76q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbnhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1952c8e8-d8db-4bf4-81b5-57be48de5cbc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:46:12 crc kubenswrapper[4895]: I0320 13:46:12.248016 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-s5pxf"] Mar 20 13:46:12 crc kubenswrapper[4895]: E0320 13:46:12.475247 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 20 13:46:12 crc kubenswrapper[4895]: E0320 13:46:12.475294 4895 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 20 13:46:12 crc kubenswrapper[4895]: E0320 13:46:12.475419 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4bql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*t
rue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-d69kx_openstack(bb5ee4b2-1013-4687-b3aa-df5362f4b435): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:46:12 crc kubenswrapper[4895]: E0320 13:46:12.477933 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-d69kx" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.159129 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.170589 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:13 crc kubenswrapper[4895]: W0320 13:46:13.174531 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03db2022_e4fb_4a53_b757_d4d1fedb7e26.slice/crio-15e0422146d23cb801f898e2a9b330b146f099da1a62e08c4926088c9a2549c9 WatchSource:0}: Error finding container 15e0422146d23cb801f898e2a9b330b146f099da1a62e08c4926088c9a2549c9: Status 404 returned error can't find the container with id 15e0422146d23cb801f898e2a9b330b146f099da1a62e08c4926088c9a2549c9 Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.338265 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.362879 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a6f84dd-56f2-4594-a3a0-bd428f57c6be","Type":"ContainerStarted","Data":"d0b419b1916f4a6278a4394a919111431bf0332d2baed46301ab1136e73a6e95"} Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.370945 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1952c8e8-d8db-4bf4-81b5-57be48de5cbc","Type":"ContainerStarted","Data":"424709200b930de40b795bedf88c3f7ac1a615591b747c7bac85f8bfd3fcd7c6"} Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.381024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa162ed3-a588-406c-a81e-5aafc5a82d05","Type":"ContainerStarted","Data":"bcc181d2a82afbdeb0583b8d2438e398332112dcdd28cb2fac7dc5b98cb66890"} Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.382994 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" event={"ID":"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f","Type":"ContainerStarted","Data":"ea102da35833fde12d035b139d2474e3c3bf2ec436d227e19cde49a77aaffd0a"} Mar 20 13:46:13 crc kubenswrapper[4895]: I0320 13:46:13.384451 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" event={"ID":"03db2022-e4fb-4a53-b757-d4d1fedb7e26","Type":"ContainerStarted","Data":"15e0422146d23cb801f898e2a9b330b146f099da1a62e08c4926088c9a2549c9"} Mar 20 13:46:13 crc kubenswrapper[4895]: E0320 13:46:13.386451 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-d69kx" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.339618 4895 scope.go:117] "RemoveContainer" 
containerID="9313604646fb5e94a0fd71d1e10533c0d80e5848ddc14e2ff988fe81d1f51052" Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.374948 4895 scope.go:117] "RemoveContainer" containerID="0c4fa97276a3f95dbaa51409217ec0946b559dc47ad924a834afbb5c6fdf4646" Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.395410 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" event={"ID":"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f","Type":"ContainerStarted","Data":"d23ddf192605fa0e5bbf443c8ac0bd87dd021f31be6818e70058dcce6581d7fd"} Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.404866 4895 generic.go:334] "Generic (PLEG): container finished" podID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerID="e7620d7bba5a4c29c81d06f9b6c008ace67c808e345bb36ca699408aef6b7e3d" exitCode=0 Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.404927 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" event={"ID":"03db2022-e4fb-4a53-b757-d4d1fedb7e26","Type":"ContainerDied","Data":"e7620d7bba5a4c29c81d06f9b6c008ace67c808e345bb36ca699408aef6b7e3d"} Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.412361 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" podStartSLOduration=13.338852681 podStartE2EDuration="14.412343872s" podCreationTimestamp="2026-03-20 13:46:00 +0000 UTC" firstStartedPulling="2026-03-20 13:46:12.440728313 +0000 UTC m=+1471.950447289" lastFinishedPulling="2026-03-20 13:46:13.514219514 +0000 UTC m=+1473.023938480" observedRunningTime="2026-03-20 13:46:14.410673031 +0000 UTC m=+1473.920391997" watchObservedRunningTime="2026-03-20 13:46:14.412343872 +0000 UTC m=+1473.922062838" Mar 20 13:46:14 crc kubenswrapper[4895]: I0320 13:46:14.414987 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1952c8e8-d8db-4bf4-81b5-57be48de5cbc","Type":"ContainerStarted","Data":"37e16534c24ef35b807c6ade4406914cb79b15426db785d3002828ed37b83ed3"} Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.427002 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" event={"ID":"03db2022-e4fb-4a53-b757-d4d1fedb7e26","Type":"ContainerStarted","Data":"36d4cce3b7639082862f3e12cd38a27c91a29731e154929f05288c9be7293cc1"} Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.427550 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.428528 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a6f84dd-56f2-4594-a3a0-bd428f57c6be","Type":"ContainerStarted","Data":"5b1be597ab8859b278fbccda2158365fb96af0bd8fe76234f58c9ebecdf0147f"} Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.430974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa162ed3-a588-406c-a81e-5aafc5a82d05","Type":"ContainerStarted","Data":"f356fe94b0a4172b0e6feba6b33bd1d76532ae0377174e3f02d7728b7d7a8ccd"} Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.436515 4895 generic.go:334] "Generic (PLEG): container finished" podID="f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" containerID="d23ddf192605fa0e5bbf443c8ac0bd87dd021f31be6818e70058dcce6581d7fd" exitCode=0 Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.436550 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" event={"ID":"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f","Type":"ContainerDied","Data":"d23ddf192605fa0e5bbf443c8ac0bd87dd021f31be6818e70058dcce6581d7fd"} Mar 20 13:46:15 crc kubenswrapper[4895]: I0320 13:46:15.460331 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" podStartSLOduration=12.460299377 podStartE2EDuration="12.460299377s" podCreationTimestamp="2026-03-20 13:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:15.453840828 +0000 UTC m=+1474.963559804" watchObservedRunningTime="2026-03-20 13:46:15.460299377 +0000 UTC m=+1474.970018343" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.082763 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.217969 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4md8m\" (UniqueName: \"kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m\") pod \"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f\" (UID: \"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f\") " Mar 20 13:46:17 crc kubenswrapper[4895]: E0320 13:46:17.223455 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1952c8e8-d8db-4bf4-81b5-57be48de5cbc" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.235132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m" (OuterVolumeSpecName: "kube-api-access-4md8m") pod "f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" (UID: "f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f"). InnerVolumeSpecName "kube-api-access-4md8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.322194 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4md8m\" (UniqueName: \"kubernetes.io/projected/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f-kube-api-access-4md8m\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.460100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1952c8e8-d8db-4bf4-81b5-57be48de5cbc","Type":"ContainerStarted","Data":"a6bd245517b494021a8f10e5d916383c87e8a8cd797ae2ffad03cc22552bacab"} Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.461215 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:17 crc kubenswrapper[4895]: E0320 13:46:17.462139 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="1952c8e8-d8db-4bf4-81b5-57be48de5cbc" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.463703 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" event={"ID":"f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f","Type":"ContainerDied","Data":"ea102da35833fde12d035b139d2474e3c3bf2ec436d227e19cde49a77aaffd0a"} Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.463756 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea102da35833fde12d035b139d2474e3c3bf2ec436d227e19cde49a77aaffd0a" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.463749 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-s5pxf" Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.523584 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-4rshv"] Mar 20 13:46:17 crc kubenswrapper[4895]: I0320 13:46:17.550335 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-4rshv"] Mar 20 13:46:18 crc kubenswrapper[4895]: E0320 13:46:18.482221 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="1952c8e8-d8db-4bf4-81b5-57be48de5cbc" Mar 20 13:46:19 crc kubenswrapper[4895]: I0320 13:46:19.229940 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f341d72e-a04d-4f58-a7f9-bed0b19710ae" path="/var/lib/kubelet/pods/f341d72e-a04d-4f58-a7f9-bed0b19710ae/volumes" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.491598 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.553069 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.553334 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="dnsmasq-dns" containerID="cri-o://f413ac5d52a19b7db9b3724bb23581cfeee294e48610a7c08b8d88507fda8625" gracePeriod=10 Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.603614 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" podUID="338f95a2-0180-49f2-80b4-46673037665a" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.234:5353: connect: connection refused" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.709489 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-p6pm5"] Mar 20 13:46:23 crc kubenswrapper[4895]: E0320 13:46:23.709919 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" containerName="oc" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.709935 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" containerName="oc" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.710134 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" containerName="oc" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.712444 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.735071 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-p6pm5"] Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.861947 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-config\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.862547 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc 
kubenswrapper[4895]: I0320 13:46:23.862650 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2htw\" (UniqueName: \"kubernetes.io/projected/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-kube-api-access-w2htw\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.862722 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.862857 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.862984 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.863067 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " 
pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964777 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964826 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2htw\" (UniqueName: \"kubernetes.io/projected/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-kube-api-access-w2htw\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964951 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 
13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.964968 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.965022 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-config\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.966039 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-config\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.966121 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-openstack-edpm-ipam\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.966704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-sb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.966919 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-svc\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.967380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-ovsdbserver-nb\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.967826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-dns-swift-storage-0\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:23 crc kubenswrapper[4895]: I0320 13:46:23.990242 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2htw\" (UniqueName: \"kubernetes.io/projected/2bdf219a-4e3a-448d-9624-bc31e07f1ad2-kube-api-access-w2htw\") pod \"dnsmasq-dns-c4b758ff5-p6pm5\" (UID: \"2bdf219a-4e3a-448d-9624-bc31e07f1ad2\") " pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.035623 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.558820 4895 generic.go:334] "Generic (PLEG): container finished" podID="338f95a2-0180-49f2-80b4-46673037665a" containerID="f413ac5d52a19b7db9b3724bb23581cfeee294e48610a7c08b8d88507fda8625" exitCode=0 Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.559430 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" event={"ID":"338f95a2-0180-49f2-80b4-46673037665a","Type":"ContainerDied","Data":"f413ac5d52a19b7db9b3724bb23581cfeee294e48610a7c08b8d88507fda8625"} Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.559466 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" event={"ID":"338f95a2-0180-49f2-80b4-46673037665a","Type":"ContainerDied","Data":"912afc9994e1f3304ae43630cb1f0e55e5ba6581676cec12473b7b7b93a70f00"} Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.559497 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912afc9994e1f3304ae43630cb1f0e55e5ba6581676cec12473b7b7b93a70f00" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.593916 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783366 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783468 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783512 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq2b4\" (UniqueName: \"kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783742 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783895 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.783917 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0\") pod \"338f95a2-0180-49f2-80b4-46673037665a\" (UID: \"338f95a2-0180-49f2-80b4-46673037665a\") " Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.788456 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4" (OuterVolumeSpecName: "kube-api-access-tq2b4") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "kube-api-access-tq2b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.875062 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.875480 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.877723 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.879755 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.886623 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq2b4\" (UniqueName: \"kubernetes.io/projected/338f95a2-0180-49f2-80b4-46673037665a-kube-api-access-tq2b4\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.886669 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.886682 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.886694 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.886707 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.887470 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config" (OuterVolumeSpecName: "config") pod "338f95a2-0180-49f2-80b4-46673037665a" (UID: "338f95a2-0180-49f2-80b4-46673037665a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:24 crc kubenswrapper[4895]: W0320 13:46:24.913205 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bdf219a_4e3a_448d_9624_bc31e07f1ad2.slice/crio-659bdf3f03b9b5c3a08609d81cd761c37f69c3c16be9573e62ca9c30c161f0f5 WatchSource:0}: Error finding container 659bdf3f03b9b5c3a08609d81cd761c37f69c3c16be9573e62ca9c30c161f0f5: Status 404 returned error can't find the container with id 659bdf3f03b9b5c3a08609d81cd761c37f69c3c16be9573e62ca9c30c161f0f5 Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.913757 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4b758ff5-p6pm5"] Mar 20 13:46:24 crc kubenswrapper[4895]: I0320 13:46:24.988229 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338f95a2-0180-49f2-80b4-46673037665a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.571204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d69kx" event={"ID":"bb5ee4b2-1013-4687-b3aa-df5362f4b435","Type":"ContainerStarted","Data":"426631ec71ba496fbd3a725082cb1a64b438daaad90ae43f5d881da6219047f0"} Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.577678 4895 generic.go:334] "Generic (PLEG): container finished" podID="2bdf219a-4e3a-448d-9624-bc31e07f1ad2" containerID="d9d55fbf2e59c92e15a085101d90a4c81524b2abc74796c67525196c50b9e0e0" exitCode=0 Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.577786 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dd998c-qjgm2" Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.577799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" event={"ID":"2bdf219a-4e3a-448d-9624-bc31e07f1ad2","Type":"ContainerDied","Data":"d9d55fbf2e59c92e15a085101d90a4c81524b2abc74796c67525196c50b9e0e0"} Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.577849 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" event={"ID":"2bdf219a-4e3a-448d-9624-bc31e07f1ad2","Type":"ContainerStarted","Data":"659bdf3f03b9b5c3a08609d81cd761c37f69c3c16be9573e62ca9c30c161f0f5"} Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.599697 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-d69kx" podStartSLOduration=2.7830465540000002 podStartE2EDuration="39.599674756s" podCreationTimestamp="2026-03-20 13:45:46 +0000 UTC" firstStartedPulling="2026-03-20 13:45:47.665204174 +0000 UTC m=+1447.174923140" lastFinishedPulling="2026-03-20 13:46:24.481832376 +0000 UTC m=+1483.991551342" observedRunningTime="2026-03-20 13:46:25.589157757 +0000 UTC m=+1485.098876733" watchObservedRunningTime="2026-03-20 13:46:25.599674756 +0000 UTC m=+1485.109393732" Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.618838 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:46:25 crc kubenswrapper[4895]: I0320 13:46:25.643326 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dd998c-qjgm2"] Mar 20 13:46:26 crc kubenswrapper[4895]: I0320 13:46:26.590435 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" event={"ID":"2bdf219a-4e3a-448d-9624-bc31e07f1ad2","Type":"ContainerStarted","Data":"ade03e170e3810747f81834247792bb9e897ba33dc65f51541bab8f92b29eceb"} Mar 20 13:46:26 crc 
kubenswrapper[4895]: I0320 13:46:26.590623 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:26 crc kubenswrapper[4895]: I0320 13:46:26.620319 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" podStartSLOduration=3.620302 podStartE2EDuration="3.620302s" podCreationTimestamp="2026-03-20 13:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:26.613654216 +0000 UTC m=+1486.123373202" watchObservedRunningTime="2026-03-20 13:46:26.620302 +0000 UTC m=+1486.130020976" Mar 20 13:46:27 crc kubenswrapper[4895]: I0320 13:46:27.227063 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338f95a2-0180-49f2-80b4-46673037665a" path="/var/lib/kubelet/pods/338f95a2-0180-49f2-80b4-46673037665a/volumes" Mar 20 13:46:27 crc kubenswrapper[4895]: I0320 13:46:27.601763 4895 generic.go:334] "Generic (PLEG): container finished" podID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" containerID="426631ec71ba496fbd3a725082cb1a64b438daaad90ae43f5d881da6219047f0" exitCode=0 Mar 20 13:46:27 crc kubenswrapper[4895]: I0320 13:46:27.601865 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d69kx" event={"ID":"bb5ee4b2-1013-4687-b3aa-df5362f4b435","Type":"ContainerDied","Data":"426631ec71ba496fbd3a725082cb1a64b438daaad90ae43f5d881da6219047f0"} Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.367583 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.481990 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data\") pod \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.482122 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4bql\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql\") pod \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.482184 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs\") pod \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.482214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle\") pod \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.482304 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts\") pod \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\" (UID: \"bb5ee4b2-1013-4687-b3aa-df5362f4b435\") " Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.493224 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql" (OuterVolumeSpecName: "kube-api-access-j4bql") pod "bb5ee4b2-1013-4687-b3aa-df5362f4b435" (UID: "bb5ee4b2-1013-4687-b3aa-df5362f4b435"). InnerVolumeSpecName "kube-api-access-j4bql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.507988 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts" (OuterVolumeSpecName: "scripts") pod "bb5ee4b2-1013-4687-b3aa-df5362f4b435" (UID: "bb5ee4b2-1013-4687-b3aa-df5362f4b435"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.522695 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs" (OuterVolumeSpecName: "certs") pod "bb5ee4b2-1013-4687-b3aa-df5362f4b435" (UID: "bb5ee4b2-1013-4687-b3aa-df5362f4b435"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.563594 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data" (OuterVolumeSpecName: "config-data") pod "bb5ee4b2-1013-4687-b3aa-df5362f4b435" (UID: "bb5ee4b2-1013-4687-b3aa-df5362f4b435"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.564171 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb5ee4b2-1013-4687-b3aa-df5362f4b435" (UID: "bb5ee4b2-1013-4687-b3aa-df5362f4b435"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.585033 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.585273 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4bql\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-kube-api-access-j4bql\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.585333 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb5ee4b2-1013-4687-b3aa-df5362f4b435-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.585385 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.585460 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb5ee4b2-1013-4687-b3aa-df5362f4b435-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.621296 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d69kx" event={"ID":"bb5ee4b2-1013-4687-b3aa-df5362f4b435","Type":"ContainerDied","Data":"cd463d9d0e0758419ba03477b06d13d2f1f4d8ebc855d9541934a6bc0328fe2c"} Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.621345 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd463d9d0e0758419ba03477b06d13d2f1f4d8ebc855d9541934a6bc0328fe2c" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.621375 4895 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-d69kx" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.704470 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-wbsfq"] Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.716742 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-wbsfq"] Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.815860 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-8j7fc"] Mar 20 13:46:29 crc kubenswrapper[4895]: E0320 13:46:29.816357 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" containerName="cloudkitty-db-sync" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.816379 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" containerName="cloudkitty-db-sync" Mar 20 13:46:29 crc kubenswrapper[4895]: E0320 13:46:29.816420 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="init" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.816427 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="init" Mar 20 13:46:29 crc kubenswrapper[4895]: E0320 13:46:29.816445 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="dnsmasq-dns" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.816468 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="dnsmasq-dns" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.816788 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" containerName="cloudkitty-db-sync" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 
13:46:29.816802 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="338f95a2-0180-49f2-80b4-46673037665a" containerName="dnsmasq-dns" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.817670 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.827882 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.849152 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-8j7fc"] Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.993653 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.993774 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.993927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqmvw\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.994028 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:29 crc kubenswrapper[4895]: I0320 13:46:29.994218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.095978 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqmvw\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.096035 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.096128 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.096171 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.096209 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.101408 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.101550 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.102290 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.102302 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle\") pod 
\"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.114733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqmvw\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw\") pod \"cloudkitty-storageinit-8j7fc\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.148938 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:30 crc kubenswrapper[4895]: I0320 13:46:30.653704 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-8j7fc"] Mar 20 13:46:31 crc kubenswrapper[4895]: I0320 13:46:31.227757 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb790d89-50de-47f6-9361-0c2f1bf39636" path="/var/lib/kubelet/pods/eb790d89-50de-47f6-9361-0c2f1bf39636/volumes" Mar 20 13:46:31 crc kubenswrapper[4895]: I0320 13:46:31.669845 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-8j7fc" event={"ID":"9547e88e-4e6b-4034-a86e-d8145d5257e1","Type":"ContainerStarted","Data":"83b93b75d955dfefd2bfcebf3d45c221ddbe20c674c5c90be7b02296d67e3358"} Mar 20 13:46:31 crc kubenswrapper[4895]: I0320 13:46:31.669905 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-8j7fc" event={"ID":"9547e88e-4e6b-4034-a86e-d8145d5257e1","Type":"ContainerStarted","Data":"60bce8de2362d46e02c522846ead4adf2ee2a8405e797ba413782267413f5779"} Mar 20 13:46:31 crc kubenswrapper[4895]: I0320 13:46:31.687579 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-8j7fc" podStartSLOduration=2.687560338 podStartE2EDuration="2.687560338s" 
podCreationTimestamp="2026-03-20 13:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:31.685474886 +0000 UTC m=+1491.195193842" watchObservedRunningTime="2026-03-20 13:46:31.687560338 +0000 UTC m=+1491.197279304" Mar 20 13:46:32 crc kubenswrapper[4895]: I0320 13:46:32.689231 4895 generic.go:334] "Generic (PLEG): container finished" podID="9547e88e-4e6b-4034-a86e-d8145d5257e1" containerID="83b93b75d955dfefd2bfcebf3d45c221ddbe20c674c5c90be7b02296d67e3358" exitCode=0 Mar 20 13:46:32 crc kubenswrapper[4895]: I0320 13:46:32.689687 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-8j7fc" event={"ID":"9547e88e-4e6b-4034-a86e-d8145d5257e1","Type":"ContainerDied","Data":"83b93b75d955dfefd2bfcebf3d45c221ddbe20c674c5c90be7b02296d67e3358"} Mar 20 13:46:33 crc kubenswrapper[4895]: I0320 13:46:33.228902 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:46:33 crc kubenswrapper[4895]: I0320 13:46:33.704509 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1952c8e8-d8db-4bf4-81b5-57be48de5cbc","Type":"ContainerStarted","Data":"582d39a5e1f9052115fd5b135fd80142f94e088320f6e4997f76b78f2c536b51"} Mar 20 13:46:33 crc kubenswrapper[4895]: I0320 13:46:33.738396 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.409543619 podStartE2EDuration="41.738359159s" podCreationTimestamp="2026-03-20 13:45:52 +0000 UTC" firstStartedPulling="2026-03-20 13:45:53.061028927 +0000 UTC m=+1452.570747893" lastFinishedPulling="2026-03-20 13:46:33.389844427 +0000 UTC m=+1492.899563433" observedRunningTime="2026-03-20 13:46:33.727945664 +0000 UTC m=+1493.237664640" watchObservedRunningTime="2026-03-20 13:46:33.738359159 +0000 UTC m=+1493.248078145" Mar 20 13:46:34 crc 
kubenswrapper[4895]: I0320 13:46:34.037571 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c4b758ff5-p6pm5" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.095621 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.096052 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="dnsmasq-dns" containerID="cri-o://36d4cce3b7639082862f3e12cd38a27c91a29731e154929f05288c9be7293cc1" gracePeriod=10 Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.686336 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.724191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-8j7fc" event={"ID":"9547e88e-4e6b-4034-a86e-d8145d5257e1","Type":"ContainerDied","Data":"60bce8de2362d46e02c522846ead4adf2ee2a8405e797ba413782267413f5779"} Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.724230 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60bce8de2362d46e02c522846ead4adf2ee2a8405e797ba413782267413f5779" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.724765 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-8j7fc" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.739479 4895 generic.go:334] "Generic (PLEG): container finished" podID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerID="36d4cce3b7639082862f3e12cd38a27c91a29731e154929f05288c9be7293cc1" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.739527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" event={"ID":"03db2022-e4fb-4a53-b757-d4d1fedb7e26","Type":"ContainerDied","Data":"36d4cce3b7639082862f3e12cd38a27c91a29731e154929f05288c9be7293cc1"} Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.806988 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle\") pod \"9547e88e-4e6b-4034-a86e-d8145d5257e1\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.807068 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data\") pod \"9547e88e-4e6b-4034-a86e-d8145d5257e1\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.807329 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts\") pod \"9547e88e-4e6b-4034-a86e-d8145d5257e1\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.807358 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs\") pod \"9547e88e-4e6b-4034-a86e-d8145d5257e1\" (UID: 
\"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.807445 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqmvw\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw\") pod \"9547e88e-4e6b-4034-a86e-d8145d5257e1\" (UID: \"9547e88e-4e6b-4034-a86e-d8145d5257e1\") " Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.812971 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs" (OuterVolumeSpecName: "certs") pod "9547e88e-4e6b-4034-a86e-d8145d5257e1" (UID: "9547e88e-4e6b-4034-a86e-d8145d5257e1"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.813566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw" (OuterVolumeSpecName: "kube-api-access-qqmvw") pod "9547e88e-4e6b-4034-a86e-d8145d5257e1" (UID: "9547e88e-4e6b-4034-a86e-d8145d5257e1"). InnerVolumeSpecName "kube-api-access-qqmvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.830223 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts" (OuterVolumeSpecName: "scripts") pod "9547e88e-4e6b-4034-a86e-d8145d5257e1" (UID: "9547e88e-4e6b-4034-a86e-d8145d5257e1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.873440 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data" (OuterVolumeSpecName: "config-data") pod "9547e88e-4e6b-4034-a86e-d8145d5257e1" (UID: "9547e88e-4e6b-4034-a86e-d8145d5257e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.882437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9547e88e-4e6b-4034-a86e-d8145d5257e1" (UID: "9547e88e-4e6b-4034-a86e-d8145d5257e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.909997 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.910027 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.910038 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqmvw\" (UniqueName: \"kubernetes.io/projected/9547e88e-4e6b-4034-a86e-d8145d5257e1-kube-api-access-qqmvw\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.910048 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:34 
crc kubenswrapper[4895]: I0320 13:46:34.910056 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9547e88e-4e6b-4034-a86e-d8145d5257e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:34 crc kubenswrapper[4895]: I0320 13:46:34.977739 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116652 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116873 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116912 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116942 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnfxm\" (UniqueName: \"kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.116964 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb\") pod \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\" (UID: \"03db2022-e4fb-4a53-b757-d4d1fedb7e26\") " Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.130262 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm" (OuterVolumeSpecName: "kube-api-access-xnfxm") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "kube-api-access-xnfxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.195282 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.197009 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.203096 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.209686 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config" (OuterVolumeSpecName: "config") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.220519 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.220766 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.220783 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.220795 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.220810 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnfxm\" (UniqueName: \"kubernetes.io/projected/03db2022-e4fb-4a53-b757-d4d1fedb7e26-kube-api-access-xnfxm\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.221473 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.228864 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03db2022-e4fb-4a53-b757-d4d1fedb7e26" (UID: "03db2022-e4fb-4a53-b757-d4d1fedb7e26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.323970 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.324000 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03db2022-e4fb-4a53-b757-d4d1fedb7e26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.764225 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" event={"ID":"03db2022-e4fb-4a53-b757-d4d1fedb7e26","Type":"ContainerDied","Data":"15e0422146d23cb801f898e2a9b330b146f099da1a62e08c4926088c9a2549c9"} Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.764590 4895 scope.go:117] "RemoveContainer" containerID="36d4cce3b7639082862f3e12cd38a27c91a29731e154929f05288c9be7293cc1" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.764282 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dc7c944bf-rkt9d" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.804171 4895 scope.go:117] "RemoveContainer" containerID="e7620d7bba5a4c29c81d06f9b6c008ace67c808e345bb36ca699408aef6b7e3d" Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.809962 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.829405 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.829668 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" containerName="cloudkitty-proc" containerID="cri-o://70792aa593d19fda325f57f3d40186af6a6ff83b0fe62091093ba7c5b5349a57" gracePeriod=30 Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.838776 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dc7c944bf-rkt9d"] Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.869897 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.870210 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api-log" containerID="cri-o://8b4d5890ef9b3f9c1c231bdec2d0c5de2fc4ce600260b1bfc48ce81b6162b5ad" gracePeriod=30 Mar 20 13:46:35 crc kubenswrapper[4895]: I0320 13:46:35.870443 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api" containerID="cri-o://846aa22b798b8c9b22182359f4929a0a90e89a0ed0b32e24bdd54e683361d419" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.360970 
4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.200:8889/healthcheck\": read tcp 10.217.0.2:56148->10.217.0.200:8889: read: connection reset by peer" Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.779183 4895 generic.go:334] "Generic (PLEG): container finished" podID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" containerID="70792aa593d19fda325f57f3d40186af6a6ff83b0fe62091093ba7c5b5349a57" exitCode=0 Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.779266 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517","Type":"ContainerDied","Data":"70792aa593d19fda325f57f3d40186af6a6ff83b0fe62091093ba7c5b5349a57"} Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.782671 4895 generic.go:334] "Generic (PLEG): container finished" podID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerID="846aa22b798b8c9b22182359f4929a0a90e89a0ed0b32e24bdd54e683361d419" exitCode=0 Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.782694 4895 generic.go:334] "Generic (PLEG): container finished" podID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerID="8b4d5890ef9b3f9c1c231bdec2d0c5de2fc4ce600260b1bfc48ce81b6162b5ad" exitCode=143 Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.782715 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerDied","Data":"846aa22b798b8c9b22182359f4929a0a90e89a0ed0b32e24bdd54e683361d419"} Mar 20 13:46:36 crc kubenswrapper[4895]: I0320 13:46:36.782736 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerDied","Data":"8b4d5890ef9b3f9c1c231bdec2d0c5de2fc4ce600260b1bfc48ce81b6162b5ad"} Mar 20 
13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.222869 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" path="/var/lib/kubelet/pods/03db2022-e4fb-4a53-b757-d4d1fedb7e26/volumes" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.410894 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567103 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567204 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567241 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567276 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqb6v\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567351 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567398 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567429 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567472 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.567522 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom\") pod \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\" (UID: \"d1850d10-9153-42cc-93de-ef76e2d9a8c1\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.570075 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs" (OuterVolumeSpecName: "logs") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.574164 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs" (OuterVolumeSpecName: "certs") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.582372 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v" (OuterVolumeSpecName: "kube-api-access-rqb6v") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "kube-api-access-rqb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.585695 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts" (OuterVolumeSpecName: "scripts") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.609706 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.672841 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.673090 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqb6v\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-kube-api-access-rqb6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.673151 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d1850d10-9153-42cc-93de-ef76e2d9a8c1-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.673213 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.673267 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1850d10-9153-42cc-93de-ef76e2d9a8c1-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.676824 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.678827 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data" (OuterVolumeSpecName: "config-data") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.687597 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.705702 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1850d10-9153-42cc-93de-ef76e2d9a8c1" (UID: "d1850d10-9153-42cc-93de-ef76e2d9a8c1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.758731 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.781127 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.782156 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.782217 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.782271 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1850d10-9153-42cc-93de-ef76e2d9a8c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.807494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"d1850d10-9153-42cc-93de-ef76e2d9a8c1","Type":"ContainerDied","Data":"ea12950ed42a8cb692af598414c1d40bc22a29c033c07a5eda7640b39a839ded"} Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.807753 4895 scope.go:117] "RemoveContainer" containerID="846aa22b798b8c9b22182359f4929a0a90e89a0ed0b32e24bdd54e683361d419" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.807552 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.810812 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517","Type":"ContainerDied","Data":"1cbf7b1f9783f7870452cb65669b7801d298c8eaab0630c31ed72f6a3b5dd00a"} Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.810909 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.843931 4895 scope.go:117] "RemoveContainer" containerID="8b4d5890ef9b3f9c1c231bdec2d0c5de2fc4ce600260b1bfc48ce81b6162b5ad" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.857138 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.874166 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888036 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts\") pod \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dtrn\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn\") pod \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888106 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs\") pod 
\"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888124 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom\") pod \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888231 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data\") pod \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.888253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle\") pod \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\" (UID: \"a3ddc604-3d2a-45c7-99ae-2dd92d3d4517\") " Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.895061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.897497 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts" (OuterVolumeSpecName: "scripts") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.903700 4895 scope.go:117] "RemoveContainer" containerID="70792aa593d19fda325f57f3d40186af6a6ff83b0fe62091093ba7c5b5349a57" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.906795 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs" (OuterVolumeSpecName: "certs") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914142 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:37 crc kubenswrapper[4895]: E0320 13:46:37.914786 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="init" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914800 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="init" Mar 20 13:46:37 crc kubenswrapper[4895]: E0320 13:46:37.914823 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547e88e-4e6b-4034-a86e-d8145d5257e1" containerName="cloudkitty-storageinit" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914829 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547e88e-4e6b-4034-a86e-d8145d5257e1" containerName="cloudkitty-storageinit" Mar 20 13:46:37 crc kubenswrapper[4895]: E0320 13:46:37.914845 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914852 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api" Mar 20 13:46:37 crc 
kubenswrapper[4895]: E0320 13:46:37.914864 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="dnsmasq-dns" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914871 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="dnsmasq-dns" Mar 20 13:46:37 crc kubenswrapper[4895]: E0320 13:46:37.914884 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api-log" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914890 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api-log" Mar 20 13:46:37 crc kubenswrapper[4895]: E0320 13:46:37.914900 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" containerName="cloudkitty-proc" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.914906 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" containerName="cloudkitty-proc" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.915069 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.915171 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" containerName="cloudkitty-api-log" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.915183 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="03db2022-e4fb-4a53-b757-d4d1fedb7e26" containerName="dnsmasq-dns" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.915196 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" containerName="cloudkitty-proc" Mar 20 13:46:37 crc 
kubenswrapper[4895]: I0320 13:46:37.915206 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9547e88e-4e6b-4034-a86e-d8145d5257e1" containerName="cloudkitty-storageinit" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.916656 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.918211 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn" (OuterVolumeSpecName: "kube-api-access-5dtrn") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "kube-api-access-5dtrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.921979 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.922220 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.922980 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.935331 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.943226 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data" (OuterVolumeSpecName: "config-data") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.991125 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.991353 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dtrn\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-kube-api-access-5dtrn\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.991461 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.991516 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4895]: I0320 13:46:37.991577 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.003578 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" (UID: "a3ddc604-3d2a-45c7-99ae-2dd92d3d4517"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.093632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-scripts\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.093946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094085 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094236 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf2sl\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-kube-api-access-jf2sl\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094423 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc 
kubenswrapper[4895]: I0320 13:46:38.094552 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094765 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094882 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.094927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8172bdb2-c101-4267-b041-46af02229c2c-logs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.095261 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.157680 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.172229 4895 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.185793 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.187123 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.194097 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.196770 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.196873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf2sl\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-kube-api-access-jf2sl\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.196915 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.196938 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: 
\"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.196972 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.197005 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.197027 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8172bdb2-c101-4267-b041-46af02229c2c-logs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.197097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-scripts\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.197111 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.200439 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.201117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.203164 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.204042 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8172bdb2-c101-4267-b041-46af02229c2c-logs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.204447 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.207687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-scripts\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 
13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.209265 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-certs\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.212726 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8172bdb2-c101-4267-b041-46af02229c2c-config-data\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.220891 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.224933 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf2sl\" (UniqueName: \"kubernetes.io/projected/8172bdb2-c101-4267-b041-46af02229c2c-kube-api-access-jf2sl\") pod \"cloudkitty-api-0\" (UID: \"8172bdb2-c101-4267-b041-46af02229c2c\") " pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.266648 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.299152 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.299241 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.299344 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-certs\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.299477 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rkt\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-kube-api-access-f4rkt\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.299540 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc 
kubenswrapper[4895]: I0320 13:46:38.299632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.402466 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4rkt\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-kube-api-access-f4rkt\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.402866 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.402953 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.403047 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.403085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.403184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-certs\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.409606 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.412869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.414273 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.415751 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-certs\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc 
kubenswrapper[4895]: I0320 13:46:38.420911 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4838ae67-efa2-48a2-86e7-1cb231be8eed-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.423006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4rkt\" (UniqueName: \"kubernetes.io/projected/4838ae67-efa2-48a2-86e7-1cb231be8eed-kube-api-access-f4rkt\") pod \"cloudkitty-proc-0\" (UID: \"4838ae67-efa2-48a2-86e7-1cb231be8eed\") " pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.517198 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.761477 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 20 13:46:38 crc kubenswrapper[4895]: I0320 13:46:38.834048 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8172bdb2-c101-4267-b041-46af02229c2c","Type":"ContainerStarted","Data":"8cf4b535b3eba8d5bc1d72ef52ddef1213718fb102b405f4549ff6dc17ab77b0"} Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.019820 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 20 13:46:39 crc kubenswrapper[4895]: W0320 13:46:39.029950 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4838ae67_efa2_48a2_86e7_1cb231be8eed.slice/crio-da4e0722766329e67c07324b6876f289b2cd5e68d1fbd9e261f33e330611885f WatchSource:0}: Error finding container da4e0722766329e67c07324b6876f289b2cd5e68d1fbd9e261f33e330611885f: Status 404 returned error can't find the container with id 
da4e0722766329e67c07324b6876f289b2cd5e68d1fbd9e261f33e330611885f Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.231774 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ddc604-3d2a-45c7-99ae-2dd92d3d4517" path="/var/lib/kubelet/pods/a3ddc604-3d2a-45c7-99ae-2dd92d3d4517/volumes" Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.232305 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1850d10-9153-42cc-93de-ef76e2d9a8c1" path="/var/lib/kubelet/pods/d1850d10-9153-42cc-93de-ef76e2d9a8c1/volumes" Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.846917 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8172bdb2-c101-4267-b041-46af02229c2c","Type":"ContainerStarted","Data":"c7b3b86d47f0dab07b80b45388b122796497c752407033b56fc6ae56ccea0e0f"} Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.847157 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8172bdb2-c101-4267-b041-46af02229c2c","Type":"ContainerStarted","Data":"c1b15b01562256b6500aa1724cef8244aa966f0da1d064d0a541db372e24b462"} Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.847300 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.848449 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"4838ae67-efa2-48a2-86e7-1cb231be8eed","Type":"ContainerStarted","Data":"da4e0722766329e67c07324b6876f289b2cd5e68d1fbd9e261f33e330611885f"} Mar 20 13:46:39 crc kubenswrapper[4895]: I0320 13:46:39.875993 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.875969963 podStartE2EDuration="2.875969963s" podCreationTimestamp="2026-03-20 13:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:39.864978863 +0000 UTC m=+1499.374697849" watchObservedRunningTime="2026-03-20 13:46:39.875969963 +0000 UTC m=+1499.385688939" Mar 20 13:46:40 crc kubenswrapper[4895]: I0320 13:46:40.863931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"4838ae67-efa2-48a2-86e7-1cb231be8eed","Type":"ContainerStarted","Data":"b637e903f603312c27a532430925c21ae84f8f5dc48c5e7cdd1ccb78ae0fc873"} Mar 20 13:46:40 crc kubenswrapper[4895]: I0320 13:46:40.891361 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.011858447 podStartE2EDuration="2.891334357s" podCreationTimestamp="2026-03-20 13:46:38 +0000 UTC" firstStartedPulling="2026-03-20 13:46:39.032868455 +0000 UTC m=+1498.542587421" lastFinishedPulling="2026-03-20 13:46:39.912344365 +0000 UTC m=+1499.422063331" observedRunningTime="2026-03-20 13:46:40.879526368 +0000 UTC m=+1500.389245344" watchObservedRunningTime="2026-03-20 13:46:40.891334357 +0000 UTC m=+1500.401053323" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.784953 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj"] Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.786901 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.791781 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.792179 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.792454 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.792653 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.816385 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj"] Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.915858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndq2r\" (UniqueName: \"kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.915919 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 
13:46:42.915944 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:42 crc kubenswrapper[4895]: I0320 13:46:42.916508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.018103 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.018195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndq2r\" (UniqueName: \"kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.018241 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.018264 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.024982 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.031148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.045063 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.068134 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndq2r\" (UniqueName: \"kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.104830 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:46:43 crc kubenswrapper[4895]: I0320 13:46:43.902947 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj"] Mar 20 13:46:44 crc kubenswrapper[4895]: I0320 13:46:44.910796 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" event={"ID":"57227803-046a-4dd7-8f7f-c93f09f2ab4c","Type":"ContainerStarted","Data":"49d270a99ea72376051142883e906c19160fe52d679fc88d165a8df81666f2a0"} Mar 20 13:46:46 crc kubenswrapper[4895]: I0320 13:46:46.955537 4895 generic.go:334] "Generic (PLEG): container finished" podID="6a6f84dd-56f2-4594-a3a0-bd428f57c6be" containerID="5b1be597ab8859b278fbccda2158365fb96af0bd8fe76234f58c9ebecdf0147f" exitCode=0 Mar 20 13:46:46 crc kubenswrapper[4895]: I0320 13:46:46.955639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a6f84dd-56f2-4594-a3a0-bd428f57c6be","Type":"ContainerDied","Data":"5b1be597ab8859b278fbccda2158365fb96af0bd8fe76234f58c9ebecdf0147f"} Mar 20 13:46:47 crc kubenswrapper[4895]: I0320 13:46:47.980634 4895 generic.go:334] "Generic (PLEG): container finished" podID="fa162ed3-a588-406c-a81e-5aafc5a82d05" 
containerID="f356fe94b0a4172b0e6feba6b33bd1d76532ae0377174e3f02d7728b7d7a8ccd" exitCode=0 Mar 20 13:46:47 crc kubenswrapper[4895]: I0320 13:46:47.980738 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa162ed3-a588-406c-a81e-5aafc5a82d05","Type":"ContainerDied","Data":"f356fe94b0a4172b0e6feba6b33bd1d76532ae0377174e3f02d7728b7d7a8ccd"} Mar 20 13:46:48 crc kubenswrapper[4895]: I0320 13:46:48.009635 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a6f84dd-56f2-4594-a3a0-bd428f57c6be","Type":"ContainerStarted","Data":"19d0be565993fedfa9ca28c6fb86cb6a6604f3282bebc015695340c12fbd4d4e"} Mar 20 13:46:48 crc kubenswrapper[4895]: I0320 13:46:48.010723 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:46:48 crc kubenswrapper[4895]: I0320 13:46:48.076522 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.076501076 podStartE2EDuration="42.076501076s" podCreationTimestamp="2026-03-20 13:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:48.05873099 +0000 UTC m=+1507.568449956" watchObservedRunningTime="2026-03-20 13:46:48.076501076 +0000 UTC m=+1507.586220052" Mar 20 13:46:49 crc kubenswrapper[4895]: I0320 13:46:49.021366 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa162ed3-a588-406c-a81e-5aafc5a82d05","Type":"ContainerStarted","Data":"d21dd936310b051836ba5e9e7b4b9d06ceb9552b8296a3eed8c579227ed1e631"} Mar 20 13:46:49 crc kubenswrapper[4895]: I0320 13:46:49.021949 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:46:49 crc kubenswrapper[4895]: I0320 13:46:49.058117 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.058094942 podStartE2EDuration="43.058094942s" podCreationTimestamp="2026-03-20 13:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:49.048674271 +0000 UTC m=+1508.558393237" watchObservedRunningTime="2026-03-20 13:46:49.058094942 +0000 UTC m=+1508.567813908" Mar 20 13:46:57 crc kubenswrapper[4895]: I0320 13:46:57.041311 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6a6f84dd-56f2-4594-a3a0-bd428f57c6be" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.247:5671: connect: connection refused" Mar 20 13:47:01 crc kubenswrapper[4895]: I0320 13:47:01.184808 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" event={"ID":"57227803-046a-4dd7-8f7f-c93f09f2ab4c","Type":"ContainerStarted","Data":"c742f23e8b5213002fe6c8264b2c2d6e824316f2ee88f143f293d1f6040e0dc6"} Mar 20 13:47:01 crc kubenswrapper[4895]: I0320 13:47:01.211684 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" podStartSLOduration=2.755120525 podStartE2EDuration="19.211654444s" podCreationTimestamp="2026-03-20 13:46:42 +0000 UTC" firstStartedPulling="2026-03-20 13:46:43.905779135 +0000 UTC m=+1503.415498101" lastFinishedPulling="2026-03-20 13:47:00.362313054 +0000 UTC m=+1519.872032020" observedRunningTime="2026-03-20 13:47:01.200251825 +0000 UTC m=+1520.709970801" watchObservedRunningTime="2026-03-20 13:47:01.211654444 +0000 UTC m=+1520.721373420" Mar 20 13:47:07 crc kubenswrapper[4895]: I0320 13:47:07.031572 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:47:07 crc 
kubenswrapper[4895]: I0320 13:47:07.041111 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:47:11 crc kubenswrapper[4895]: I0320 13:47:11.312343 4895 generic.go:334] "Generic (PLEG): container finished" podID="57227803-046a-4dd7-8f7f-c93f09f2ab4c" containerID="c742f23e8b5213002fe6c8264b2c2d6e824316f2ee88f143f293d1f6040e0dc6" exitCode=0 Mar 20 13:47:11 crc kubenswrapper[4895]: I0320 13:47:11.312437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" event={"ID":"57227803-046a-4dd7-8f7f-c93f09f2ab4c","Type":"ContainerDied","Data":"c742f23e8b5213002fe6c8264b2c2d6e824316f2ee88f143f293d1f6040e0dc6"} Mar 20 13:47:12 crc kubenswrapper[4895]: I0320 13:47:12.878428 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.010118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndq2r\" (UniqueName: \"kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r\") pod \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.010217 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory\") pod \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.010258 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle\") pod \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\" 
(UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.010338 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam\") pod \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\" (UID: \"57227803-046a-4dd7-8f7f-c93f09f2ab4c\") " Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.016490 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "57227803-046a-4dd7-8f7f-c93f09f2ab4c" (UID: "57227803-046a-4dd7-8f7f-c93f09f2ab4c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.023631 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r" (OuterVolumeSpecName: "kube-api-access-ndq2r") pod "57227803-046a-4dd7-8f7f-c93f09f2ab4c" (UID: "57227803-046a-4dd7-8f7f-c93f09f2ab4c"). InnerVolumeSpecName "kube-api-access-ndq2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.046596 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57227803-046a-4dd7-8f7f-c93f09f2ab4c" (UID: "57227803-046a-4dd7-8f7f-c93f09f2ab4c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.049065 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory" (OuterVolumeSpecName: "inventory") pod "57227803-046a-4dd7-8f7f-c93f09f2ab4c" (UID: "57227803-046a-4dd7-8f7f-c93f09f2ab4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.115632 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndq2r\" (UniqueName: \"kubernetes.io/projected/57227803-046a-4dd7-8f7f-c93f09f2ab4c-kube-api-access-ndq2r\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.115675 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.115688 4895 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.115699 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57227803-046a-4dd7-8f7f-c93f09f2ab4c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.332017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" event={"ID":"57227803-046a-4dd7-8f7f-c93f09f2ab4c","Type":"ContainerDied","Data":"49d270a99ea72376051142883e906c19160fe52d679fc88d165a8df81666f2a0"} Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.332310 
4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d270a99ea72376051142883e906c19160fe52d679fc88d165a8df81666f2a0" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.332071 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.442040 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb"] Mar 20 13:47:13 crc kubenswrapper[4895]: E0320 13:47:13.442811 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57227803-046a-4dd7-8f7f-c93f09f2ab4c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.442918 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="57227803-046a-4dd7-8f7f-c93f09f2ab4c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.443250 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="57227803-046a-4dd7-8f7f-c93f09f2ab4c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.445599 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.450158 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.451018 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.455262 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.456526 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.490077 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb"] Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.523084 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.523442 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.523619 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8qz\" (UniqueName: \"kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.625495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.625591 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8qz\" (UniqueName: \"kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.625693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.630919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.644952 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn8qz\" (UniqueName: \"kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.645551 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hhftb\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:13 crc kubenswrapper[4895]: I0320 13:47:13.771753 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.375095 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb"] Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.628254 4895 scope.go:117] "RemoveContainer" containerID="8967f215925a8aefb9eaf99d8d0dbb9a601aa8ea42640204b072a606c7c7403f" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.665296 4895 scope.go:117] "RemoveContainer" containerID="7697587e287762ce47515f74218184115e63cfb97792b8724a3bff895729b31a" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.717440 4895 scope.go:117] "RemoveContainer" containerID="7f0ecc47a978afc25c2a7716be49f21ca24938da5b6654d45c79de22b1b4e5a1" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.752190 4895 scope.go:117] "RemoveContainer" containerID="edef95c7aadc2de12b902612def468d8cf92db96635227593d1fc4c8cf48f79d" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.798122 4895 scope.go:117] "RemoveContainer" containerID="617002a65165215a129ed9952750302a16406334c1e8e358fe7833ca8dde6832" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.824755 4895 scope.go:117] "RemoveContainer" containerID="fae5152c45783fec6a8d79b16fb0f0570c62f212ef5dc69fac74017e2edd87bc" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.874542 4895 scope.go:117] "RemoveContainer" containerID="26785eee0c449721bcfafa52af3939655f25e08cd85cee66791863157bd2c4c9" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.946594 4895 scope.go:117] "RemoveContainer" containerID="27c97374a7acc4bbb9412bcb712b4beb51a28c4c64a1d5a4de424a262ffdba2f" Mar 20 13:47:14 crc kubenswrapper[4895]: I0320 13:47:14.971258 4895 scope.go:117] "RemoveContainer" containerID="9aabfb7c063f6233a078d4c562ccf0edf17a494752a0173dd120f4d4b03ed45d" Mar 20 13:47:15 crc kubenswrapper[4895]: I0320 13:47:15.370122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" event={"ID":"389115ab-14fd-4b1e-96a3-33453ff90899","Type":"ContainerStarted","Data":"508ea24e203fab0fc8faf7a5eedb5fe3b3127fcc7b56a3310141bdbcc9066b05"} Mar 20 13:47:15 crc kubenswrapper[4895]: I0320 13:47:15.370512 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" event={"ID":"389115ab-14fd-4b1e-96a3-33453ff90899","Type":"ContainerStarted","Data":"23b0a3f023a234098d9f03fef5e816ccb762887892275a1552c91eb3692f390b"} Mar 20 13:47:15 crc kubenswrapper[4895]: I0320 13:47:15.404486 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" podStartSLOduration=1.780593316 podStartE2EDuration="2.404465045s" podCreationTimestamp="2026-03-20 13:47:13 +0000 UTC" firstStartedPulling="2026-03-20 13:47:14.38829207 +0000 UTC m=+1533.898011036" lastFinishedPulling="2026-03-20 13:47:15.012163799 +0000 UTC m=+1534.521882765" observedRunningTime="2026-03-20 13:47:15.393358313 +0000 UTC m=+1534.903077279" watchObservedRunningTime="2026-03-20 13:47:15.404465045 +0000 UTC m=+1534.914184031" Mar 20 13:47:15 crc kubenswrapper[4895]: I0320 13:47:15.493517 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 20 13:47:18 crc kubenswrapper[4895]: I0320 13:47:18.402984 4895 generic.go:334] "Generic (PLEG): container finished" podID="389115ab-14fd-4b1e-96a3-33453ff90899" containerID="508ea24e203fab0fc8faf7a5eedb5fe3b3127fcc7b56a3310141bdbcc9066b05" exitCode=0 Mar 20 13:47:18 crc kubenswrapper[4895]: I0320 13:47:18.403033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" event={"ID":"389115ab-14fd-4b1e-96a3-33453ff90899","Type":"ContainerDied","Data":"508ea24e203fab0fc8faf7a5eedb5fe3b3127fcc7b56a3310141bdbcc9066b05"} Mar 20 13:47:20 crc 
kubenswrapper[4895]: I0320 13:47:20.185244 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.286817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory\") pod \"389115ab-14fd-4b1e-96a3-33453ff90899\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.286895 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam\") pod \"389115ab-14fd-4b1e-96a3-33453ff90899\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.287002 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn8qz\" (UniqueName: \"kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz\") pod \"389115ab-14fd-4b1e-96a3-33453ff90899\" (UID: \"389115ab-14fd-4b1e-96a3-33453ff90899\") " Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.292372 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz" (OuterVolumeSpecName: "kube-api-access-gn8qz") pod "389115ab-14fd-4b1e-96a3-33453ff90899" (UID: "389115ab-14fd-4b1e-96a3-33453ff90899"). InnerVolumeSpecName "kube-api-access-gn8qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.319533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory" (OuterVolumeSpecName: "inventory") pod "389115ab-14fd-4b1e-96a3-33453ff90899" (UID: "389115ab-14fd-4b1e-96a3-33453ff90899"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.352633 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "389115ab-14fd-4b1e-96a3-33453ff90899" (UID: "389115ab-14fd-4b1e-96a3-33453ff90899"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.390248 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.390285 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/389115ab-14fd-4b1e-96a3-33453ff90899-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.390302 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn8qz\" (UniqueName: \"kubernetes.io/projected/389115ab-14fd-4b1e-96a3-33453ff90899-kube-api-access-gn8qz\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.426901 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" 
event={"ID":"389115ab-14fd-4b1e-96a3-33453ff90899","Type":"ContainerDied","Data":"23b0a3f023a234098d9f03fef5e816ccb762887892275a1552c91eb3692f390b"} Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.426950 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b0a3f023a234098d9f03fef5e816ccb762887892275a1552c91eb3692f390b" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.427046 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hhftb" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.584852 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx"] Mar 20 13:47:20 crc kubenswrapper[4895]: E0320 13:47:20.585506 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389115ab-14fd-4b1e-96a3-33453ff90899" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.585604 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="389115ab-14fd-4b1e-96a3-33453ff90899" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.585894 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="389115ab-14fd-4b1e-96a3-33453ff90899" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.586683 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.588952 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.588960 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.589307 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.589600 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.610903 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx"] Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.694926 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs2tj\" (UniqueName: \"kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.695064 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 
13:47:20.695090 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.695134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.796651 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.796779 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs2tj\" (UniqueName: \"kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.796916 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.796960 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.801010 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.801475 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.805802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.876451 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs2tj\" (UniqueName: \"kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:20 crc kubenswrapper[4895]: I0320 13:47:20.905599 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:47:21 crc kubenswrapper[4895]: I0320 13:47:21.559300 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx"] Mar 20 13:47:22 crc kubenswrapper[4895]: I0320 13:47:22.297655 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:22 crc kubenswrapper[4895]: I0320 13:47:22.297891 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:22 crc kubenswrapper[4895]: I0320 13:47:22.451014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" event={"ID":"80853d34-f97d-49e6-b582-3408214efe70","Type":"ContainerStarted","Data":"e5ab6617afe35963224591cfaf2c6d364273fc14619e9734cc6928955b5bcc90"} Mar 20 
13:47:23 crc kubenswrapper[4895]: I0320 13:47:23.463311 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" event={"ID":"80853d34-f97d-49e6-b582-3408214efe70","Type":"ContainerStarted","Data":"d95097426b93eb7891319b0f394e19e35d8c6748c8147b22250cbe783f2f5260"} Mar 20 13:47:23 crc kubenswrapper[4895]: I0320 13:47:23.487352 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" podStartSLOduration=2.761161903 podStartE2EDuration="3.487332262s" podCreationTimestamp="2026-03-20 13:47:20 +0000 UTC" firstStartedPulling="2026-03-20 13:47:21.56696571 +0000 UTC m=+1541.076684676" lastFinishedPulling="2026-03-20 13:47:22.293136069 +0000 UTC m=+1541.802855035" observedRunningTime="2026-03-20 13:47:23.481317303 +0000 UTC m=+1542.991036269" watchObservedRunningTime="2026-03-20 13:47:23.487332262 +0000 UTC m=+1542.997051228" Mar 20 13:47:52 crc kubenswrapper[4895]: I0320 13:47:52.296974 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:52 crc kubenswrapper[4895]: I0320 13:47:52.297546 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.180325 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.184905 4895 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.198306 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.278182 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.278324 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrf5x\" (UniqueName: \"kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.278454 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.379803 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.379980 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.380011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrf5x\" (UniqueName: \"kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.380502 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.380672 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.405324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrf5x\" (UniqueName: \"kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x\") pod \"redhat-marketplace-64vns\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:53 crc kubenswrapper[4895]: I0320 13:47:53.512025 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:47:54 crc kubenswrapper[4895]: I0320 13:47:54.078998 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:47:54 crc kubenswrapper[4895]: I0320 13:47:54.801449 4895 generic.go:334] "Generic (PLEG): container finished" podID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerID="de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e" exitCode=0 Mar 20 13:47:54 crc kubenswrapper[4895]: I0320 13:47:54.801489 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerDied","Data":"de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e"} Mar 20 13:47:54 crc kubenswrapper[4895]: I0320 13:47:54.801515 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerStarted","Data":"4697da41a63457eb44c0ab36f9cf86b289cb7fd9a0392075ded71b34c052ab52"} Mar 20 13:47:56 crc kubenswrapper[4895]: I0320 13:47:56.825306 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerStarted","Data":"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc"} Mar 20 13:47:57 crc kubenswrapper[4895]: I0320 13:47:57.845721 4895 generic.go:334] "Generic (PLEG): container finished" podID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerID="fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc" exitCode=0 Mar 20 13:47:57 crc kubenswrapper[4895]: I0320 13:47:57.845962 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" 
event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerDied","Data":"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc"} Mar 20 13:47:58 crc kubenswrapper[4895]: I0320 13:47:58.857324 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerStarted","Data":"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd"} Mar 20 13:47:58 crc kubenswrapper[4895]: I0320 13:47:58.880227 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64vns" podStartSLOduration=2.440458512 podStartE2EDuration="5.880207206s" podCreationTimestamp="2026-03-20 13:47:53 +0000 UTC" firstStartedPulling="2026-03-20 13:47:54.803317198 +0000 UTC m=+1574.313036164" lastFinishedPulling="2026-03-20 13:47:58.243065892 +0000 UTC m=+1577.752784858" observedRunningTime="2026-03-20 13:47:58.874995168 +0000 UTC m=+1578.384714134" watchObservedRunningTime="2026-03-20 13:47:58.880207206 +0000 UTC m=+1578.389926172" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.148228 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-h8mbp"] Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.150019 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.153741 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.154182 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.154378 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.158825 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-h8mbp"] Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.316374 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf285\" (UniqueName: \"kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285\") pod \"auto-csr-approver-29566908-h8mbp\" (UID: \"749126f3-49e7-49b4-b8b5-b8a853df2990\") " pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.418841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf285\" (UniqueName: \"kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285\") pod \"auto-csr-approver-29566908-h8mbp\" (UID: \"749126f3-49e7-49b4-b8b5-b8a853df2990\") " pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.436973 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf285\" (UniqueName: \"kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285\") pod \"auto-csr-approver-29566908-h8mbp\" (UID: \"749126f3-49e7-49b4-b8b5-b8a853df2990\") " 
pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.501609 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:00 crc kubenswrapper[4895]: I0320 13:48:00.979307 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-h8mbp"] Mar 20 13:48:01 crc kubenswrapper[4895]: I0320 13:48:01.887765 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" event={"ID":"749126f3-49e7-49b4-b8b5-b8a853df2990","Type":"ContainerStarted","Data":"458c139f91f91ff3fe4182eb8c4308fc240ba2eb92b9fe9605fdce436de537fe"} Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.513614 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.513996 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.573838 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.908820 4895 generic.go:334] "Generic (PLEG): container finished" podID="749126f3-49e7-49b4-b8b5-b8a853df2990" containerID="32242598d8cc028cb5b3a1aeb42dd47a8860294cb002fbfabd43c40c388f646a" exitCode=0 Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.908858 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" event={"ID":"749126f3-49e7-49b4-b8b5-b8a853df2990","Type":"ContainerDied","Data":"32242598d8cc028cb5b3a1aeb42dd47a8860294cb002fbfabd43c40c388f646a"} Mar 20 13:48:03 crc kubenswrapper[4895]: I0320 13:48:03.963960 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:04 crc kubenswrapper[4895]: I0320 13:48:04.018094 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.661305 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.762212 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf285\" (UniqueName: \"kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285\") pod \"749126f3-49e7-49b4-b8b5-b8a853df2990\" (UID: \"749126f3-49e7-49b4-b8b5-b8a853df2990\") " Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.771701 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285" (OuterVolumeSpecName: "kube-api-access-gf285") pod "749126f3-49e7-49b4-b8b5-b8a853df2990" (UID: "749126f3-49e7-49b4-b8b5-b8a853df2990"). InnerVolumeSpecName "kube-api-access-gf285". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.864048 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf285\" (UniqueName: \"kubernetes.io/projected/749126f3-49e7-49b4-b8b5-b8a853df2990-kube-api-access-gf285\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.929172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" event={"ID":"749126f3-49e7-49b4-b8b5-b8a853df2990","Type":"ContainerDied","Data":"458c139f91f91ff3fe4182eb8c4308fc240ba2eb92b9fe9605fdce436de537fe"} Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.929218 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458c139f91f91ff3fe4182eb8c4308fc240ba2eb92b9fe9605fdce436de537fe" Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.929254 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-h8mbp" Mar 20 13:48:05 crc kubenswrapper[4895]: I0320 13:48:05.929361 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64vns" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="registry-server" containerID="cri-o://88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd" gracePeriod=2 Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.743502 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-98njk"] Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.755839 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-98njk"] Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.827340 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.888989 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrf5x\" (UniqueName: \"kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x\") pod \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.889181 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content\") pod \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.889216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities\") pod \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\" (UID: \"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000\") " Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.890127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities" (OuterVolumeSpecName: "utilities") pod "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" (UID: "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.898763 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x" (OuterVolumeSpecName: "kube-api-access-qrf5x") pod "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" (UID: "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000"). InnerVolumeSpecName "kube-api-access-qrf5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.931801 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" (UID: "1fb5135b-f8da-4a66-bd81-c8d1c1bc1000"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.943487 4895 generic.go:334] "Generic (PLEG): container finished" podID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerID="88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd" exitCode=0 Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.943533 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerDied","Data":"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd"} Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.943542 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64vns" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.943559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64vns" event={"ID":"1fb5135b-f8da-4a66-bd81-c8d1c1bc1000","Type":"ContainerDied","Data":"4697da41a63457eb44c0ab36f9cf86b289cb7fd9a0392075ded71b34c052ab52"} Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.943582 4895 scope.go:117] "RemoveContainer" containerID="88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.975502 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.977987 4895 scope.go:117] "RemoveContainer" containerID="fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.984873 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64vns"] Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.991332 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.991363 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:06 crc kubenswrapper[4895]: I0320 13:48:06.991373 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrf5x\" (UniqueName: \"kubernetes.io/projected/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000-kube-api-access-qrf5x\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.012211 4895 scope.go:117] 
"RemoveContainer" containerID="de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.049678 4895 scope.go:117] "RemoveContainer" containerID="88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd" Mar 20 13:48:07 crc kubenswrapper[4895]: E0320 13:48:07.050199 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd\": container with ID starting with 88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd not found: ID does not exist" containerID="88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.050245 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd"} err="failed to get container status \"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd\": rpc error: code = NotFound desc = could not find container \"88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd\": container with ID starting with 88ca97bc43977d8cbd197744a7266196496fa08e6ec7857fe552e07b16c80dcd not found: ID does not exist" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.050273 4895 scope.go:117] "RemoveContainer" containerID="fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc" Mar 20 13:48:07 crc kubenswrapper[4895]: E0320 13:48:07.051160 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc\": container with ID starting with fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc not found: ID does not exist" containerID="fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc" Mar 20 13:48:07 crc 
kubenswrapper[4895]: I0320 13:48:07.051192 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc"} err="failed to get container status \"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc\": rpc error: code = NotFound desc = could not find container \"fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc\": container with ID starting with fe2e07dc6c9fe5a776bfe8072ece6e7b84a147ad729a00fbdb7867be6cfe9dfc not found: ID does not exist" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.051214 4895 scope.go:117] "RemoveContainer" containerID="de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e" Mar 20 13:48:07 crc kubenswrapper[4895]: E0320 13:48:07.051488 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e\": container with ID starting with de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e not found: ID does not exist" containerID="de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.051508 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e"} err="failed to get container status \"de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e\": rpc error: code = NotFound desc = could not find container \"de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e\": container with ID starting with de388dc463654e59deda051614b85aa11baf554060b8203136958cd095770d6e not found: ID does not exist" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.226676 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" 
path="/var/lib/kubelet/pods/1fb5135b-f8da-4a66-bd81-c8d1c1bc1000/volumes" Mar 20 13:48:07 crc kubenswrapper[4895]: I0320 13:48:07.227541 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9eaaed-76a2-47d6-9329-bc5b9bf34807" path="/var/lib/kubelet/pods/6a9eaaed-76a2-47d6-9329-bc5b9bf34807/volumes" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.367339 4895 scope.go:117] "RemoveContainer" containerID="10ba0e63cade1941d6e51c0c40c52d8889bce7905ccb960c7be7382455b1fe58" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.400284 4895 scope.go:117] "RemoveContainer" containerID="04fc448980f3e27eb78f718bee5920dc121fb9da3801182bbe986bded028d85e" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.441034 4895 scope.go:117] "RemoveContainer" containerID="59cca7b17788ca8ea30634268b6974726d95253e7da3534f6c7823783cfd65d0" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.500294 4895 scope.go:117] "RemoveContainer" containerID="ef58461765e9f8b579090cb930f74f94825b7ec2fbeba6897bc7f3647b750142" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.537290 4895 scope.go:117] "RemoveContainer" containerID="4d74b47199da609d5e4af7b760019a23f11e58eaa856c1d0a755ac299b0573f9" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.575285 4895 scope.go:117] "RemoveContainer" containerID="5752368e1ffd0d3bd077828c3cb696bacd1311baf8f07e8b1be78a8a1c4bf9e8" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.616602 4895 scope.go:117] "RemoveContainer" containerID="9f7f83eb76679943c56c5558fdcbfa456da1a32ecca1ec54a3839f8ef1a47fde" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.667112 4895 scope.go:117] "RemoveContainer" containerID="dce7d4ad54b99c0122cf432982a21667f0ad82c14bdbff8ac649104fd4998b95" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.692760 4895 scope.go:117] "RemoveContainer" containerID="e80caf46d08c27306713cb65c020842ec448f7d45ce4f0c5181e633195a1e1e7" Mar 20 13:48:15 crc kubenswrapper[4895]: I0320 13:48:15.741636 
4895 scope.go:117] "RemoveContainer" containerID="412c2ec8e53ae785769748da243924f19737ec4340add704672428c36cf39b8b" Mar 20 13:48:22 crc kubenswrapper[4895]: I0320 13:48:22.297199 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:48:22 crc kubenswrapper[4895]: I0320 13:48:22.297771 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:48:22 crc kubenswrapper[4895]: I0320 13:48:22.297820 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:48:22 crc kubenswrapper[4895]: I0320 13:48:22.298615 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:48:22 crc kubenswrapper[4895]: I0320 13:48:22.298661 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" gracePeriod=600 Mar 20 13:48:22 crc kubenswrapper[4895]: E0320 13:48:22.421096 4895 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:48:23 crc kubenswrapper[4895]: I0320 13:48:23.105516 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" exitCode=0 Mar 20 13:48:23 crc kubenswrapper[4895]: I0320 13:48:23.105575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732"} Mar 20 13:48:23 crc kubenswrapper[4895]: I0320 13:48:23.105622 4895 scope.go:117] "RemoveContainer" containerID="d09464565bb5144815482797fcbb93bafa376c2d41dc4366edc51c4ef8877edf" Mar 20 13:48:23 crc kubenswrapper[4895]: I0320 13:48:23.106410 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:48:23 crc kubenswrapper[4895]: E0320 13:48:23.106801 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:48:35 crc kubenswrapper[4895]: I0320 13:48:35.212443 4895 scope.go:117] "RemoveContainer" 
containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:48:35 crc kubenswrapper[4895]: E0320 13:48:35.213346 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:48:50 crc kubenswrapper[4895]: I0320 13:48:50.211491 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:48:50 crc kubenswrapper[4895]: E0320 13:48:50.212311 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:49:02 crc kubenswrapper[4895]: I0320 13:49:02.211790 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:49:02 crc kubenswrapper[4895]: E0320 13:49:02.213232 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:49:13 crc kubenswrapper[4895]: I0320 13:49:13.212091 4895 scope.go:117] 
"RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:49:13 crc kubenswrapper[4895]: E0320 13:49:13.212851 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:49:15 crc kubenswrapper[4895]: I0320 13:49:15.916844 4895 scope.go:117] "RemoveContainer" containerID="0ffe15514115b377c955503f6010d0347be260be2de9ff5930943a107a60892e" Mar 20 13:49:15 crc kubenswrapper[4895]: I0320 13:49:15.949812 4895 scope.go:117] "RemoveContainer" containerID="bcbca4e6a607ca47535a89f98e84b2dcdb6162ba12d0567391a7455feda65204" Mar 20 13:49:15 crc kubenswrapper[4895]: I0320 13:49:15.971430 4895 scope.go:117] "RemoveContainer" containerID="f3fe88ecd6d7e1c1ca06b312baed9c327b460504a0195a99f2642a02a3f1e305" Mar 20 13:49:15 crc kubenswrapper[4895]: I0320 13:49:15.998618 4895 scope.go:117] "RemoveContainer" containerID="a0106560817efe9b39b9f990c75ca23306d19c839c82ec7c9734edf6cfeb45b6" Mar 20 13:49:16 crc kubenswrapper[4895]: I0320 13:49:16.027975 4895 scope.go:117] "RemoveContainer" containerID="0dd97c50eae4b016f75ff6d52a35f8340ac869cfb11bbc8f632f3e50a9b5b0e5" Mar 20 13:49:16 crc kubenswrapper[4895]: I0320 13:49:16.074506 4895 scope.go:117] "RemoveContainer" containerID="dea42e680acad4d05684d3cac05722c24e32c1f8e0daf0745198451e0e2860c3" Mar 20 13:49:16 crc kubenswrapper[4895]: I0320 13:49:16.138343 4895 scope.go:117] "RemoveContainer" containerID="a8a34bb45f13f06b73de9c45ee24079b73349126dc38caec857cffc252977dd2" Mar 20 13:49:16 crc kubenswrapper[4895]: I0320 13:49:16.179363 4895 scope.go:117] "RemoveContainer" 
containerID="ec31ceca8278d8dfe9f061298e95a8c044f65b3cbc9b3f63a3e7fec5d113cbfe" Mar 20 13:49:27 crc kubenswrapper[4895]: I0320 13:49:27.211978 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:49:27 crc kubenswrapper[4895]: E0320 13:49:27.212728 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:49:42 crc kubenswrapper[4895]: I0320 13:49:42.211185 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:49:42 crc kubenswrapper[4895]: E0320 13:49:42.211907 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:49:56 crc kubenswrapper[4895]: I0320 13:49:56.211351 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:49:56 crc kubenswrapper[4895]: E0320 13:49:56.211976 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.161178 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-fxn58"] Mar 20 13:50:00 crc kubenswrapper[4895]: E0320 13:50:00.162046 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="extract-content" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162064 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="extract-content" Mar 20 13:50:00 crc kubenswrapper[4895]: E0320 13:50:00.162087 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749126f3-49e7-49b4-b8b5-b8a853df2990" containerName="oc" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162097 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="749126f3-49e7-49b4-b8b5-b8a853df2990" containerName="oc" Mar 20 13:50:00 crc kubenswrapper[4895]: E0320 13:50:00.162123 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="registry-server" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162129 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="registry-server" Mar 20 13:50:00 crc kubenswrapper[4895]: E0320 13:50:00.162150 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="extract-utilities" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162157 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="extract-utilities" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162419 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="749126f3-49e7-49b4-b8b5-b8a853df2990" containerName="oc" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.162438 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb5135b-f8da-4a66-bd81-c8d1c1bc1000" containerName="registry-server" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.163255 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.165741 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.165818 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.167477 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.173497 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-fxn58"] Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.258070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzpw\" (UniqueName: \"kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw\") pod \"auto-csr-approver-29566910-fxn58\" (UID: \"b2c47434-e191-4697-83c8-6bd904c9a2c7\") " pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.360611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzpw\" (UniqueName: \"kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw\") pod \"auto-csr-approver-29566910-fxn58\" (UID: \"b2c47434-e191-4697-83c8-6bd904c9a2c7\") " 
pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.400356 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzpw\" (UniqueName: \"kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw\") pod \"auto-csr-approver-29566910-fxn58\" (UID: \"b2c47434-e191-4697-83c8-6bd904c9a2c7\") " pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.479159 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:00 crc kubenswrapper[4895]: I0320 13:50:00.961910 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-fxn58"] Mar 20 13:50:01 crc kubenswrapper[4895]: I0320 13:50:01.106346 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-fxn58" event={"ID":"b2c47434-e191-4697-83c8-6bd904c9a2c7","Type":"ContainerStarted","Data":"b165f41ae2b4e2faf5dc8764ae5715e754d5168a0cf49c4130b64531197125ff"} Mar 20 13:50:03 crc kubenswrapper[4895]: I0320 13:50:03.125962 4895 generic.go:334] "Generic (PLEG): container finished" podID="b2c47434-e191-4697-83c8-6bd904c9a2c7" containerID="95d81e8beb0e84e1165f14f28fe92796bd058b15bb3c7c946e84a36e0b62b075" exitCode=0 Mar 20 13:50:03 crc kubenswrapper[4895]: I0320 13:50:03.126077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-fxn58" event={"ID":"b2c47434-e191-4697-83c8-6bd904c9a2c7","Type":"ContainerDied","Data":"95d81e8beb0e84e1165f14f28fe92796bd058b15bb3c7c946e84a36e0b62b075"} Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.119900 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.155338 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xzpw\" (UniqueName: \"kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw\") pod \"b2c47434-e191-4697-83c8-6bd904c9a2c7\" (UID: \"b2c47434-e191-4697-83c8-6bd904c9a2c7\") " Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.173215 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw" (OuterVolumeSpecName: "kube-api-access-9xzpw") pod "b2c47434-e191-4697-83c8-6bd904c9a2c7" (UID: "b2c47434-e191-4697-83c8-6bd904c9a2c7"). InnerVolumeSpecName "kube-api-access-9xzpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.179594 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-fxn58" event={"ID":"b2c47434-e191-4697-83c8-6bd904c9a2c7","Type":"ContainerDied","Data":"b165f41ae2b4e2faf5dc8764ae5715e754d5168a0cf49c4130b64531197125ff"} Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.179820 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b165f41ae2b4e2faf5dc8764ae5715e754d5168a0cf49c4130b64531197125ff" Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.179633 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-fxn58" Mar 20 13:50:05 crc kubenswrapper[4895]: I0320 13:50:05.258725 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xzpw\" (UniqueName: \"kubernetes.io/projected/b2c47434-e191-4697-83c8-6bd904c9a2c7-kube-api-access-9xzpw\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:06 crc kubenswrapper[4895]: I0320 13:50:06.198640 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-6m8fs"] Mar 20 13:50:06 crc kubenswrapper[4895]: I0320 13:50:06.209492 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-6m8fs"] Mar 20 13:50:07 crc kubenswrapper[4895]: I0320 13:50:07.240921 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998b603e-c3b3-48a4-8c84-db84434afa48" path="/var/lib/kubelet/pods/998b603e-c3b3-48a4-8c84-db84434afa48/volumes" Mar 20 13:50:10 crc kubenswrapper[4895]: I0320 13:50:10.212744 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:50:10 crc kubenswrapper[4895]: E0320 13:50:10.214082 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:50:14 crc kubenswrapper[4895]: I0320 13:50:14.283139 4895 generic.go:334] "Generic (PLEG): container finished" podID="80853d34-f97d-49e6-b582-3408214efe70" containerID="d95097426b93eb7891319b0f394e19e35d8c6748c8147b22250cbe783f2f5260" exitCode=0 Mar 20 13:50:14 crc kubenswrapper[4895]: I0320 13:50:14.283679 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" event={"ID":"80853d34-f97d-49e6-b582-3408214efe70","Type":"ContainerDied","Data":"d95097426b93eb7891319b0f394e19e35d8c6748c8147b22250cbe783f2f5260"} Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.296560 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.331973 4895 scope.go:117] "RemoveContainer" containerID="dc60ae3f9fae0b7e021db4f4cddb0b76499aebb399d4feabb075ca1db475c83c" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.350346 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" event={"ID":"80853d34-f97d-49e6-b582-3408214efe70","Type":"ContainerDied","Data":"e5ab6617afe35963224591cfaf2c6d364273fc14619e9734cc6928955b5bcc90"} Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.350402 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5ab6617afe35963224591cfaf2c6d364273fc14619e9734cc6928955b5bcc90" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.350496 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.381105 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs2tj\" (UniqueName: \"kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj\") pod \"80853d34-f97d-49e6-b582-3408214efe70\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.381317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle\") pod \"80853d34-f97d-49e6-b582-3408214efe70\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.381450 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam\") pod \"80853d34-f97d-49e6-b582-3408214efe70\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.381485 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory\") pod \"80853d34-f97d-49e6-b582-3408214efe70\" (UID: \"80853d34-f97d-49e6-b582-3408214efe70\") " Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.395145 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "80853d34-f97d-49e6-b582-3408214efe70" (UID: "80853d34-f97d-49e6-b582-3408214efe70"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.409437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj" (OuterVolumeSpecName: "kube-api-access-xs2tj") pod "80853d34-f97d-49e6-b582-3408214efe70" (UID: "80853d34-f97d-49e6-b582-3408214efe70"). InnerVolumeSpecName "kube-api-access-xs2tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.411819 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w"] Mar 20 13:50:16 crc kubenswrapper[4895]: E0320 13:50:16.412297 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80853d34-f97d-49e6-b582-3408214efe70" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.412321 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="80853d34-f97d-49e6-b582-3408214efe70" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:50:16 crc kubenswrapper[4895]: E0320 13:50:16.412360 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c47434-e191-4697-83c8-6bd904c9a2c7" containerName="oc" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.412367 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c47434-e191-4697-83c8-6bd904c9a2c7" containerName="oc" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.412568 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="80853d34-f97d-49e6-b582-3408214efe70" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.412590 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c47434-e191-4697-83c8-6bd904c9a2c7" containerName="oc" Mar 20 13:50:16 crc kubenswrapper[4895]: 
I0320 13:50:16.413567 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.418831 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory" (OuterVolumeSpecName: "inventory") pod "80853d34-f97d-49e6-b582-3408214efe70" (UID: "80853d34-f97d-49e6-b582-3408214efe70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.423166 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w"] Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.446029 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80853d34-f97d-49e6-b582-3408214efe70" (UID: "80853d34-f97d-49e6-b582-3408214efe70"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.485574 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.485854 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.486002 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.486204 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.486293 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs2tj\" (UniqueName: \"kubernetes.io/projected/80853d34-f97d-49e6-b582-3408214efe70-kube-api-access-xs2tj\") on node \"crc\" DevicePath 
\"\"" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.486437 4895 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.486529 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80853d34-f97d-49e6-b582-3408214efe70-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.589314 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.589379 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.589531 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 
13:50:16.593077 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.593538 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.606143 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7l89w\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:16 crc kubenswrapper[4895]: I0320 13:50:16.846055 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:50:17 crc kubenswrapper[4895]: I0320 13:50:17.412066 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w"] Mar 20 13:50:18 crc kubenswrapper[4895]: I0320 13:50:18.369742 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" event={"ID":"d4ba85e6-8f8d-4f5e-9e05-48690b7da983","Type":"ContainerStarted","Data":"4582645d2d487b6410ddc1ab4fbca824d7fb72d88bbe58d400944ff78d410415"} Mar 20 13:50:19 crc kubenswrapper[4895]: I0320 13:50:19.379647 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" event={"ID":"d4ba85e6-8f8d-4f5e-9e05-48690b7da983","Type":"ContainerStarted","Data":"2e134abb8cbe379e2661079f79bdc9105e84e82336eb3a05a09eb80a2b10cb56"} Mar 20 13:50:19 crc kubenswrapper[4895]: I0320 13:50:19.405932 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" podStartSLOduration=2.692010634 podStartE2EDuration="3.405909885s" podCreationTimestamp="2026-03-20 13:50:16 +0000 UTC" firstStartedPulling="2026-03-20 13:50:17.432793231 +0000 UTC m=+1716.942512197" lastFinishedPulling="2026-03-20 13:50:18.146692482 +0000 UTC m=+1717.656411448" observedRunningTime="2026-03-20 13:50:19.394410332 +0000 UTC m=+1718.904129298" watchObservedRunningTime="2026-03-20 13:50:19.405909885 +0000 UTC m=+1718.915628861" Mar 20 13:50:23 crc kubenswrapper[4895]: I0320 13:50:23.212504 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:50:23 crc kubenswrapper[4895]: E0320 13:50:23.213374 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:50:37 crc kubenswrapper[4895]: I0320 13:50:37.212640 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:50:37 crc kubenswrapper[4895]: E0320 13:50:37.213228 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.048735 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kwm44"] Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.058336 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kwm44"] Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.070279 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e0bd-account-create-update-gh5pj"] Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.079606 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e0bd-account-create-update-gh5pj"] Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.223290 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60" path="/var/lib/kubelet/pods/b98a9bf6-0f6e-410a-8d4b-10bfea0d3b60/volumes" Mar 20 13:50:47 crc kubenswrapper[4895]: I0320 13:50:47.223952 4895 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e7d9d49d-3f8e-4d52-9824-2f74e592d3ae" path="/var/lib/kubelet/pods/e7d9d49d-3f8e-4d52-9824-2f74e592d3ae/volumes" Mar 20 13:50:48 crc kubenswrapper[4895]: I0320 13:50:48.213062 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:50:48 crc kubenswrapper[4895]: E0320 13:50:48.213297 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.030940 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9e0a-account-create-update-7gcwg"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.044971 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c7d5-account-create-update-tmrfb"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.064261 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7z6b6"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.074696 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-45kh2"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.084689 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9e0a-account-create-update-7gcwg"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.094109 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7z6b6"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.105079 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-45kh2"] Mar 20 
13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.118808 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c7d5-account-create-update-tmrfb"] Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.222335 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89345a71-c6db-4cc8-9ee9-120d5cfd4426" path="/var/lib/kubelet/pods/89345a71-c6db-4cc8-9ee9-120d5cfd4426/volumes" Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.222965 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb" path="/var/lib/kubelet/pods/ba653d5b-44a2-4eb4-9e1b-96e4a4c353cb/volumes" Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.223582 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc3b9065-76cf-4c3c-b701-ae88033ddeef" path="/var/lib/kubelet/pods/cc3b9065-76cf-4c3c-b701-ae88033ddeef/volumes" Mar 20 13:50:55 crc kubenswrapper[4895]: I0320 13:50:55.224154 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d681d662-1e62-4fa1-bf4d-4e9740068509" path="/var/lib/kubelet/pods/d681d662-1e62-4fa1-bf4d-4e9740068509/volumes" Mar 20 13:50:59 crc kubenswrapper[4895]: I0320 13:50:59.221589 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:50:59 crc kubenswrapper[4895]: E0320 13:50:59.222468 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:51:14 crc kubenswrapper[4895]: I0320 13:51:14.212343 4895 scope.go:117] "RemoveContainer" 
containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:51:14 crc kubenswrapper[4895]: E0320 13:51:14.213162 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.049852 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mgd6t"] Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.062387 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mgd6t"] Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.477988 4895 scope.go:117] "RemoveContainer" containerID="b266d79b11df30f79b669f7739d8fc9c27814783f4bd346863655c320a264b01" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.507262 4895 scope.go:117] "RemoveContainer" containerID="0116942417cb73924596a9a0fd45b80fcbaf5fcf9ff322d28a0da38dc4c71b91" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.533721 4895 scope.go:117] "RemoveContainer" containerID="af1352100d2092d7137fe012465a11f10fd84f26bba6920925bdadc92648b206" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.595938 4895 scope.go:117] "RemoveContainer" containerID="f413ac5d52a19b7db9b3724bb23581cfeee294e48610a7c08b8d88507fda8625" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.614702 4895 scope.go:117] "RemoveContainer" containerID="3b37f8a33382926f888cb81376135ce08ee0ea083ac13984244f84562db2adf1" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.655288 4895 scope.go:117] "RemoveContainer" containerID="09879d3e625a0766e94659b2099c44ac187b1969ae5b9414269da3bb7c9b8950" Mar 20 13:51:16 
crc kubenswrapper[4895]: I0320 13:51:16.679335 4895 scope.go:117] "RemoveContainer" containerID="530ebdf7ad28d0091f593b288034bec6e31f15676adc713f1f19cc87dd623256" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.714368 4895 scope.go:117] "RemoveContainer" containerID="8b21f079ce682199b9e1c0a665e2c8c19820761ad991113b5189521ab297d381" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.767205 4895 scope.go:117] "RemoveContainer" containerID="d8291838b6414fe683c44ffbbcea5f6d6dd47c85d02579b39298c4d98734e5cb" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.837240 4895 scope.go:117] "RemoveContainer" containerID="027d9374dd6a66e14a1b75c1f10fc190f5792d9aeae857860a9b2132aa6b5f13" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.872695 4895 scope.go:117] "RemoveContainer" containerID="e4f1ff0406fcff5005682e9cb3e1e1d4f08fc2fe98cf9ffa41b1dfd7321f129e" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.922150 4895 scope.go:117] "RemoveContainer" containerID="320c135305b056c45bcdce4715e74f103d36bea3b13f54d03433d7d4727b57fb" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.945186 4895 scope.go:117] "RemoveContainer" containerID="c2bd7abd45a1eac6a4fed95e69e392b11e558cd4b5288c5b73472e95783da3b4" Mar 20 13:51:16 crc kubenswrapper[4895]: I0320 13:51:16.977625 4895 scope.go:117] "RemoveContainer" containerID="dde1795b5e49f890b5ac5235c4e010476fc3d4d0cb274397a9870e258e34ef45" Mar 20 13:51:17 crc kubenswrapper[4895]: I0320 13:51:17.018311 4895 scope.go:117] "RemoveContainer" containerID="8aa9601c57067c257f81e2d5b1e61583cfebd23ed5f2326fdb664055f7308c10" Mar 20 13:51:17 crc kubenswrapper[4895]: I0320 13:51:17.037994 4895 scope.go:117] "RemoveContainer" containerID="57b2e886e9c279023fa9125c83c7df6541549fd46b46b0aad45694c4366391b8" Mar 20 13:51:17 crc kubenswrapper[4895]: I0320 13:51:17.223692 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646f3cd6-24a0-4418-83ea-7a1f6e9b3654" 
path="/var/lib/kubelet/pods/646f3cd6-24a0-4418-83ea-7a1f6e9b3654/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.040127 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-d2e6-account-create-update-glvld"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.049422 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-d2e6-account-create-update-glvld"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.058701 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e04e-account-create-update-pnxcr"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.068152 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qvfkh"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.077144 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-whgpp"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.086876 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0bb8-account-create-update-cjtrn"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.095951 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qvfkh"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.104572 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0bb8-account-create-update-cjtrn"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.113109 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-whgpp"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.122647 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-trwxf"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.130477 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e04e-account-create-update-pnxcr"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.141146 
4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6158-account-create-update-vthkt"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.151500 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7cjkn"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.160778 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-trwxf"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.172108 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7cjkn"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.180531 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6158-account-create-update-vthkt"] Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.223168 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01eb0348-778d-4efb-a1fc-32c5f653526f" path="/var/lib/kubelet/pods/01eb0348-778d-4efb-a1fc-32c5f653526f/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.223862 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef2dd7e-834a-4bdb-8947-bca7d65185da" path="/var/lib/kubelet/pods/0ef2dd7e-834a-4bdb-8947-bca7d65185da/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.224511 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3384f4a4-4c8a-4921-b17f-95f0568d32bc" path="/var/lib/kubelet/pods/3384f4a4-4c8a-4921-b17f-95f0568d32bc/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.225213 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4424f026-2489-4ade-bfc8-f2b711fede5d" path="/var/lib/kubelet/pods/4424f026-2489-4ade-bfc8-f2b711fede5d/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.228038 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f11268f-7299-492d-acbd-f04313e097d2" 
path="/var/lib/kubelet/pods/4f11268f-7299-492d-acbd-f04313e097d2/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.228775 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9c7cee-575a-4903-8c29-c977a78ac5f6" path="/var/lib/kubelet/pods/df9c7cee-575a-4903-8c29-c977a78ac5f6/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.229426 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e457c566-019a-4ce1-96ca-d1e4d1f8ff36" path="/var/lib/kubelet/pods/e457c566-019a-4ce1-96ca-d1e4d1f8ff36/volumes" Mar 20 13:51:23 crc kubenswrapper[4895]: I0320 13:51:23.234296 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67c072b-bfa6-4ddc-b8c6-ee96f07efc91" path="/var/lib/kubelet/pods/e67c072b-bfa6-4ddc-b8c6-ee96f07efc91/volumes" Mar 20 13:51:28 crc kubenswrapper[4895]: I0320 13:51:28.036508 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zwhhm"] Mar 20 13:51:28 crc kubenswrapper[4895]: I0320 13:51:28.059700 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zwhhm"] Mar 20 13:51:29 crc kubenswrapper[4895]: I0320 13:51:29.212492 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:51:29 crc kubenswrapper[4895]: E0320 13:51:29.213034 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:51:29 crc kubenswrapper[4895]: I0320 13:51:29.224028 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87249cf1-602d-4f80-976a-bc7a59bd4cfd" 
path="/var/lib/kubelet/pods/87249cf1-602d-4f80-976a-bc7a59bd4cfd/volumes" Mar 20 13:51:32 crc kubenswrapper[4895]: I0320 13:51:32.035445 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6wxs8"] Mar 20 13:51:32 crc kubenswrapper[4895]: I0320 13:51:32.046319 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6wxs8"] Mar 20 13:51:33 crc kubenswrapper[4895]: I0320 13:51:33.222278 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958c9f6e-e716-42e2-bb6a-9d44847f4525" path="/var/lib/kubelet/pods/958c9f6e-e716-42e2-bb6a-9d44847f4525/volumes" Mar 20 13:51:41 crc kubenswrapper[4895]: I0320 13:51:41.221118 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:51:41 crc kubenswrapper[4895]: E0320 13:51:41.221874 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:51:52 crc kubenswrapper[4895]: I0320 13:51:52.211995 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:51:52 crc kubenswrapper[4895]: E0320 13:51:52.212730 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:51:59 
crc kubenswrapper[4895]: I0320 13:51:59.841955 4895 generic.go:334] "Generic (PLEG): container finished" podID="d4ba85e6-8f8d-4f5e-9e05-48690b7da983" containerID="2e134abb8cbe379e2661079f79bdc9105e84e82336eb3a05a09eb80a2b10cb56" exitCode=0 Mar 20 13:51:59 crc kubenswrapper[4895]: I0320 13:51:59.842001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" event={"ID":"d4ba85e6-8f8d-4f5e-9e05-48690b7da983","Type":"ContainerDied","Data":"2e134abb8cbe379e2661079f79bdc9105e84e82336eb3a05a09eb80a2b10cb56"} Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.153864 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-26czq"] Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.155359 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.158145 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.158550 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.159996 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.172149 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-26czq"] Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.356972 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmfx\" (UniqueName: \"kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx\") pod \"auto-csr-approver-29566912-26czq\" (UID: 
\"b2a03ef8-3463-4c6f-9415-37069d4bcbc9\") " pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.459360 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmfx\" (UniqueName: \"kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx\") pod \"auto-csr-approver-29566912-26czq\" (UID: \"b2a03ef8-3463-4c6f-9415-37069d4bcbc9\") " pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.478063 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmfx\" (UniqueName: \"kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx\") pod \"auto-csr-approver-29566912-26czq\" (UID: \"b2a03ef8-3463-4c6f-9415-37069d4bcbc9\") " pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:00 crc kubenswrapper[4895]: I0320 13:52:00.777257 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:01 crc kubenswrapper[4895]: I0320 13:52:01.509067 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-26czq"] Mar 20 13:52:01 crc kubenswrapper[4895]: I0320 13:52:01.526383 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:52:01 crc kubenswrapper[4895]: I0320 13:52:01.873889 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-26czq" event={"ID":"b2a03ef8-3463-4c6f-9415-37069d4bcbc9","Type":"ContainerStarted","Data":"aa10a10526a96c6b5c1f6ff918eb57dac6def571268bc315ed50b0d5627bcfec"} Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.110982 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.300599 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam\") pod \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.300746 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory\") pod \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.300887 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx\") pod \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\" (UID: \"d4ba85e6-8f8d-4f5e-9e05-48690b7da983\") " Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.315632 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx" (OuterVolumeSpecName: "kube-api-access-8vzpx") pod "d4ba85e6-8f8d-4f5e-9e05-48690b7da983" (UID: "d4ba85e6-8f8d-4f5e-9e05-48690b7da983"). InnerVolumeSpecName "kube-api-access-8vzpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.334423 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d4ba85e6-8f8d-4f5e-9e05-48690b7da983" (UID: "d4ba85e6-8f8d-4f5e-9e05-48690b7da983"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.361734 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory" (OuterVolumeSpecName: "inventory") pod "d4ba85e6-8f8d-4f5e-9e05-48690b7da983" (UID: "d4ba85e6-8f8d-4f5e-9e05-48690b7da983"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.403699 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.403743 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-kube-api-access-8vzpx\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.403761 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4ba85e6-8f8d-4f5e-9e05-48690b7da983-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.891045 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" 
event={"ID":"d4ba85e6-8f8d-4f5e-9e05-48690b7da983","Type":"ContainerDied","Data":"4582645d2d487b6410ddc1ab4fbca824d7fb72d88bbe58d400944ff78d410415"} Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.891096 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4582645d2d487b6410ddc1ab4fbca824d7fb72d88bbe58d400944ff78d410415" Mar 20 13:52:02 crc kubenswrapper[4895]: I0320 13:52:02.891160 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7l89w" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.211497 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:52:03 crc kubenswrapper[4895]: E0320 13:52:03.211838 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.265300 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm"] Mar 20 13:52:03 crc kubenswrapper[4895]: E0320 13:52:03.265812 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ba85e6-8f8d-4f5e-9e05-48690b7da983" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.265831 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ba85e6-8f8d-4f5e-9e05-48690b7da983" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.266011 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ba85e6-8f8d-4f5e-9e05-48690b7da983" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.266751 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.269091 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.269177 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.269686 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.270340 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.274081 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm"] Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.424912 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.425250 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46jq\" (UniqueName: 
\"kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.425497 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.527575 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.527647 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.527751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46jq\" (UniqueName: \"kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.540657 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.543293 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.545040 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46jq\" (UniqueName: \"kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.583059 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.914514 4895 generic.go:334] "Generic (PLEG): container finished" podID="b2a03ef8-3463-4c6f-9415-37069d4bcbc9" containerID="9410c0ef5938c4d4069e870ba2f308e075f32e88d48b3fd28fa000eb9145e5f6" exitCode=0 Mar 20 13:52:03 crc kubenswrapper[4895]: I0320 13:52:03.914564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-26czq" event={"ID":"b2a03ef8-3463-4c6f-9415-37069d4bcbc9","Type":"ContainerDied","Data":"9410c0ef5938c4d4069e870ba2f308e075f32e88d48b3fd28fa000eb9145e5f6"} Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.046259 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9nkgt"] Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.059111 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9nkgt"] Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.124095 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm"] Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.924869 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" event={"ID":"f741586e-ce78-4057-8e0c-032310d4e3a4","Type":"ContainerStarted","Data":"3d5de3c2aaad2d327aeadbc88d15be08b559cb61fbe64764bd3d8c18af5785bc"} Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.924929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" event={"ID":"f741586e-ce78-4057-8e0c-032310d4e3a4","Type":"ContainerStarted","Data":"c2455c6a87c44131445cf07fc0e73e93474768f682a0563eec55f8cdc92c2314"} Mar 20 13:52:04 crc kubenswrapper[4895]: I0320 13:52:04.947200 4895 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" podStartSLOduration=1.532939683 podStartE2EDuration="1.947181719s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.131529684 +0000 UTC m=+1823.641248650" lastFinishedPulling="2026-03-20 13:52:04.54577172 +0000 UTC m=+1824.055490686" observedRunningTime="2026-03-20 13:52:04.937823649 +0000 UTC m=+1824.447542615" watchObservedRunningTime="2026-03-20 13:52:04.947181719 +0000 UTC m=+1824.456900685" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.230836 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c43c76c-2573-4cce-880d-830e4fd8bed9" path="/var/lib/kubelet/pods/2c43c76c-2573-4cce-880d-830e4fd8bed9/volumes" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.643530 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.798039 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvmfx\" (UniqueName: \"kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx\") pod \"b2a03ef8-3463-4c6f-9415-37069d4bcbc9\" (UID: \"b2a03ef8-3463-4c6f-9415-37069d4bcbc9\") " Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.803603 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx" (OuterVolumeSpecName: "kube-api-access-vvmfx") pod "b2a03ef8-3463-4c6f-9415-37069d4bcbc9" (UID: "b2a03ef8-3463-4c6f-9415-37069d4bcbc9"). InnerVolumeSpecName "kube-api-access-vvmfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.900705 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvmfx\" (UniqueName: \"kubernetes.io/projected/b2a03ef8-3463-4c6f-9415-37069d4bcbc9-kube-api-access-vvmfx\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.933878 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-26czq" event={"ID":"b2a03ef8-3463-4c6f-9415-37069d4bcbc9","Type":"ContainerDied","Data":"aa10a10526a96c6b5c1f6ff918eb57dac6def571268bc315ed50b0d5627bcfec"} Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.933916 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-26czq" Mar 20 13:52:05 crc kubenswrapper[4895]: I0320 13:52:05.933921 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa10a10526a96c6b5c1f6ff918eb57dac6def571268bc315ed50b0d5627bcfec" Mar 20 13:52:06 crc kubenswrapper[4895]: I0320 13:52:06.697670 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-s5pxf"] Mar 20 13:52:06 crc kubenswrapper[4895]: I0320 13:52:06.710206 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-s5pxf"] Mar 20 13:52:07 crc kubenswrapper[4895]: I0320 13:52:07.223554 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f" path="/var/lib/kubelet/pods/f28d8dc4-bd59-4c95-a4de-c5ff101a4a1f/volumes" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.264805 4895 scope.go:117] "RemoveContainer" containerID="aabdf34cff7072c7ff33eff4f7c984dbe99a55804c109d098cb8b9a1fda59a3b" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.304784 4895 scope.go:117] "RemoveContainer" 
containerID="d23ddf192605fa0e5bbf443c8ac0bd87dd021f31be6818e70058dcce6581d7fd" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.385024 4895 scope.go:117] "RemoveContainer" containerID="fa35306d58e049177176882caa1719c8f32a63e8debd7da7dec0e87b3b63d162" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.422730 4895 scope.go:117] "RemoveContainer" containerID="f54979f6b4cc777dc85e235bc931356de05f27358c94c8a8281c5a55676230e3" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.457273 4895 scope.go:117] "RemoveContainer" containerID="f6cebf94a0f0945bf9181f0c02d805951ad9efe05aa2378d2b5b61f2ef3aa5f0" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.494253 4895 scope.go:117] "RemoveContainer" containerID="fe3846b2dc52b38399eacc3966352e405eb9814b9236f9f4089fa04ff0aaa3d4" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.538183 4895 scope.go:117] "RemoveContainer" containerID="9572e9d3da4b29d9fa2c08e844f4c2a9b44d437094ca2de97740ea956603724a" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.601561 4895 scope.go:117] "RemoveContainer" containerID="388b86e44b30393ecd25d8d99a467d833034f82e67c58ee3e5dc1934c217926e" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.625814 4895 scope.go:117] "RemoveContainer" containerID="38473f1938c42206a2af8d79bc350b6b7ffc283182278b6ef355a33d964ec4e4" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.652593 4895 scope.go:117] "RemoveContainer" containerID="2c6140c254c0e05435a4bc945323b53d2dd9a23780c4d58e3e879b969303f3a9" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.689162 4895 scope.go:117] "RemoveContainer" containerID="6de5c55fb64f6debf441bd70ddd5ac62086452e2cc05af35a00ad82c2a11f20d" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.721870 4895 scope.go:117] "RemoveContainer" containerID="b9a01c73df8552b87832259fd22d38089d241fa9a6f5bb3feea4bb455f727f0c" Mar 20 13:52:17 crc kubenswrapper[4895]: I0320 13:52:17.750107 4895 scope.go:117] "RemoveContainer" 
containerID="ec545a7e6ec7e33e26fa18a787f99b14bb3d0f9b3d70aba16d0465ef64bd0895" Mar 20 13:52:18 crc kubenswrapper[4895]: I0320 13:52:18.212341 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:52:18 crc kubenswrapper[4895]: E0320 13:52:18.212976 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.046025 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-m79dc"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.064232 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hnpjd"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.080196 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-clr45"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.094748 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-m79dc"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.104894 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hnpjd"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.117056 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-clr45"] Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.233946 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ff498d-af75-4603-8dbd-91a429e00cb8" path="/var/lib/kubelet/pods/05ff498d-af75-4603-8dbd-91a429e00cb8/volumes" Mar 20 13:52:23 crc kubenswrapper[4895]: 
I0320 13:52:23.234552 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209c54a2-1964-481c-80e4-16eaef611f4e" path="/var/lib/kubelet/pods/209c54a2-1964-481c-80e4-16eaef611f4e/volumes" Mar 20 13:52:23 crc kubenswrapper[4895]: I0320 13:52:23.235093 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56a9ed0-b52b-42a4-a875-5d383303c91e" path="/var/lib/kubelet/pods/c56a9ed0-b52b-42a4-a875-5d383303c91e/volumes" Mar 20 13:52:33 crc kubenswrapper[4895]: I0320 13:52:33.211947 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:52:33 crc kubenswrapper[4895]: E0320 13:52:33.212727 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:52:34 crc kubenswrapper[4895]: I0320 13:52:34.031160 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-57bfc"] Mar 20 13:52:34 crc kubenswrapper[4895]: I0320 13:52:34.042600 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-57bfc"] Mar 20 13:52:35 crc kubenswrapper[4895]: I0320 13:52:35.226784 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fedca4-15c2-4975-807e-e0c9ded7f329" path="/var/lib/kubelet/pods/19fedca4-15c2-4975-807e-e0c9ded7f329/volumes" Mar 20 13:52:44 crc kubenswrapper[4895]: I0320 13:52:44.212193 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:52:44 crc kubenswrapper[4895]: E0320 13:52:44.212945 4895 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:52:59 crc kubenswrapper[4895]: I0320 13:52:59.212620 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:52:59 crc kubenswrapper[4895]: E0320 13:52:59.214039 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:53:08 crc kubenswrapper[4895]: I0320 13:53:08.597851 4895 generic.go:334] "Generic (PLEG): container finished" podID="f741586e-ce78-4057-8e0c-032310d4e3a4" containerID="3d5de3c2aaad2d327aeadbc88d15be08b559cb61fbe64764bd3d8c18af5785bc" exitCode=0 Mar 20 13:53:08 crc kubenswrapper[4895]: I0320 13:53:08.597982 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" event={"ID":"f741586e-ce78-4057-8e0c-032310d4e3a4","Type":"ContainerDied","Data":"3d5de3c2aaad2d327aeadbc88d15be08b559cb61fbe64764bd3d8c18af5785bc"} Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.559191 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.623353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" event={"ID":"f741586e-ce78-4057-8e0c-032310d4e3a4","Type":"ContainerDied","Data":"c2455c6a87c44131445cf07fc0e73e93474768f682a0563eec55f8cdc92c2314"} Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.623389 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2455c6a87c44131445cf07fc0e73e93474768f682a0563eec55f8cdc92c2314" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.623477 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.641247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam\") pod \"f741586e-ce78-4057-8e0c-032310d4e3a4\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.641353 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46jq\" (UniqueName: \"kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq\") pod \"f741586e-ce78-4057-8e0c-032310d4e3a4\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.641420 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory\") pod \"f741586e-ce78-4057-8e0c-032310d4e3a4\" (UID: \"f741586e-ce78-4057-8e0c-032310d4e3a4\") " Mar 20 13:53:10 crc 
kubenswrapper[4895]: I0320 13:53:10.650572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq" (OuterVolumeSpecName: "kube-api-access-z46jq") pod "f741586e-ce78-4057-8e0c-032310d4e3a4" (UID: "f741586e-ce78-4057-8e0c-032310d4e3a4"). InnerVolumeSpecName "kube-api-access-z46jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.693156 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory" (OuterVolumeSpecName: "inventory") pod "f741586e-ce78-4057-8e0c-032310d4e3a4" (UID: "f741586e-ce78-4057-8e0c-032310d4e3a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.699980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f741586e-ce78-4057-8e0c-032310d4e3a4" (UID: "f741586e-ce78-4057-8e0c-032310d4e3a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.733282 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq"] Mar 20 13:53:10 crc kubenswrapper[4895]: E0320 13:53:10.734207 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f741586e-ce78-4057-8e0c-032310d4e3a4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.734236 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f741586e-ce78-4057-8e0c-032310d4e3a4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:10 crc kubenswrapper[4895]: E0320 13:53:10.734281 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a03ef8-3463-4c6f-9415-37069d4bcbc9" containerName="oc" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.734291 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a03ef8-3463-4c6f-9415-37069d4bcbc9" containerName="oc" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.734551 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f741586e-ce78-4057-8e0c-032310d4e3a4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.734674 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a03ef8-3463-4c6f-9415-37069d4bcbc9" containerName="oc" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.738130 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.747883 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq"] Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dsx2\" (UniqueName: \"kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749530 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749757 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749784 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46jq\" (UniqueName: \"kubernetes.io/projected/f741586e-ce78-4057-8e0c-032310d4e3a4-kube-api-access-z46jq\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.749797 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741586e-ce78-4057-8e0c-032310d4e3a4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.852199 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dsx2\" (UniqueName: \"kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.852278 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.852515 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.860458 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.864680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:10 crc kubenswrapper[4895]: I0320 13:53:10.868456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dsx2\" (UniqueName: \"kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:11 crc kubenswrapper[4895]: I0320 13:53:11.084801 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:11 crc kubenswrapper[4895]: I0320 13:53:11.613378 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq"] Mar 20 13:53:11 crc kubenswrapper[4895]: I0320 13:53:11.635832 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" event={"ID":"7d36a84a-b329-40b5-8da0-4a01ff417cc4","Type":"ContainerStarted","Data":"34edcd806c2bc4abd9d1c7d437ba646965bbe07e716a4ec8e6f63d7fac69ab94"} Mar 20 13:53:12 crc kubenswrapper[4895]: I0320 13:53:12.218053 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:53:12 crc kubenswrapper[4895]: E0320 13:53:12.219086 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:53:12 crc kubenswrapper[4895]: I0320 13:53:12.647756 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" event={"ID":"7d36a84a-b329-40b5-8da0-4a01ff417cc4","Type":"ContainerStarted","Data":"37187362f499cdafc3895363b65a8573a87c07c8e293ec17c08c8ce589a3ae21"} Mar 20 13:53:12 crc kubenswrapper[4895]: I0320 13:53:12.677827 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" podStartSLOduration=2.165607074 podStartE2EDuration="2.67780977s" podCreationTimestamp="2026-03-20 13:53:10 +0000 UTC" 
firstStartedPulling="2026-03-20 13:53:11.623686765 +0000 UTC m=+1891.133405751" lastFinishedPulling="2026-03-20 13:53:12.135889481 +0000 UTC m=+1891.645608447" observedRunningTime="2026-03-20 13:53:12.666868471 +0000 UTC m=+1892.176587437" watchObservedRunningTime="2026-03-20 13:53:12.67780977 +0000 UTC m=+1892.187528736" Mar 20 13:53:16 crc kubenswrapper[4895]: I0320 13:53:16.690973 4895 generic.go:334] "Generic (PLEG): container finished" podID="7d36a84a-b329-40b5-8da0-4a01ff417cc4" containerID="37187362f499cdafc3895363b65a8573a87c07c8e293ec17c08c8ce589a3ae21" exitCode=0 Mar 20 13:53:16 crc kubenswrapper[4895]: I0320 13:53:16.691073 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" event={"ID":"7d36a84a-b329-40b5-8da0-4a01ff417cc4","Type":"ContainerDied","Data":"37187362f499cdafc3895363b65a8573a87c07c8e293ec17c08c8ce589a3ae21"} Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.001432 4895 scope.go:117] "RemoveContainer" containerID="b5d8f9cfc2e261233c9bfc1cef442b5b2237ecd1593504dbe4c3a97cb2eae86e" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.039209 4895 scope.go:117] "RemoveContainer" containerID="0a4a6f07e6900137b6e30afe86a56ebdb7db9c6e4fbd5df523646bb2ba158250" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.092711 4895 scope.go:117] "RemoveContainer" containerID="928b408aa9ce905eb0fdab7c5bd8f5c9ca76daf917c55d5268da6de23f94d10e" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.197555 4895 scope.go:117] "RemoveContainer" containerID="7700f88e906f3ddddc252e5432352b1ecba48f2f2d6ad50e676b382bfd2848d8" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.561576 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.709814 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" event={"ID":"7d36a84a-b329-40b5-8da0-4a01ff417cc4","Type":"ContainerDied","Data":"34edcd806c2bc4abd9d1c7d437ba646965bbe07e716a4ec8e6f63d7fac69ab94"} Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.709854 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34edcd806c2bc4abd9d1c7d437ba646965bbe07e716a4ec8e6f63d7fac69ab94" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.709882 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.738802 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam\") pod \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.738975 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dsx2\" (UniqueName: \"kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2\") pod \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.739012 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory\") pod \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\" (UID: \"7d36a84a-b329-40b5-8da0-4a01ff417cc4\") " Mar 20 13:53:18 crc 
kubenswrapper[4895]: I0320 13:53:18.747432 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2" (OuterVolumeSpecName: "kube-api-access-8dsx2") pod "7d36a84a-b329-40b5-8da0-4a01ff417cc4" (UID: "7d36a84a-b329-40b5-8da0-4a01ff417cc4"). InnerVolumeSpecName "kube-api-access-8dsx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.780956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory" (OuterVolumeSpecName: "inventory") pod "7d36a84a-b329-40b5-8da0-4a01ff417cc4" (UID: "7d36a84a-b329-40b5-8da0-4a01ff417cc4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.792171 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4"] Mar 20 13:53:18 crc kubenswrapper[4895]: E0320 13:53:18.792714 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d36a84a-b329-40b5-8da0-4a01ff417cc4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.792746 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d36a84a-b329-40b5-8da0-4a01ff417cc4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.792993 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d36a84a-b329-40b5-8da0-4a01ff417cc4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.794079 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.808581 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d36a84a-b329-40b5-8da0-4a01ff417cc4" (UID: "7d36a84a-b329-40b5-8da0-4a01ff417cc4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.816581 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4"] Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.841984 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dsx2\" (UniqueName: \"kubernetes.io/projected/7d36a84a-b329-40b5-8da0-4a01ff417cc4-kube-api-access-8dsx2\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.842027 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.842041 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d36a84a-b329-40b5-8da0-4a01ff417cc4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.943587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rjdz\" (UniqueName: \"kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.943670 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:18 crc kubenswrapper[4895]: I0320 13:53:18.944091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.046855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rjdz\" (UniqueName: \"kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.046934 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.047050 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.051517 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.052130 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.063900 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rjdz\" (UniqueName: \"kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wkfk4\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.117434 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.661900 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4"] Mar 20 13:53:19 crc kubenswrapper[4895]: I0320 13:53:19.720115 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" event={"ID":"19650c9a-aeda-44ce-9793-a3b03e1d361d","Type":"ContainerStarted","Data":"5337d007ac3ad3d8fc9a6b481b46cd286939a1deb30c4f72b3b102d882a6ceeb"} Mar 20 13:53:20 crc kubenswrapper[4895]: I0320 13:53:20.733615 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" event={"ID":"19650c9a-aeda-44ce-9793-a3b03e1d361d","Type":"ContainerStarted","Data":"dc27308d201e386c96bb3c7dd6f058f0fed28f4c70c3349a4246cea755cbe5a6"} Mar 20 13:53:20 crc kubenswrapper[4895]: I0320 13:53:20.758699 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" podStartSLOduration=2.323012242 podStartE2EDuration="2.758679685s" podCreationTimestamp="2026-03-20 13:53:18 +0000 UTC" firstStartedPulling="2026-03-20 13:53:19.671414723 +0000 UTC m=+1899.181133689" lastFinishedPulling="2026-03-20 13:53:20.107082166 +0000 UTC m=+1899.616801132" observedRunningTime="2026-03-20 13:53:20.754382849 +0000 UTC m=+1900.264101825" watchObservedRunningTime="2026-03-20 13:53:20.758679685 +0000 UTC m=+1900.268398651" Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.065531 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-trcpf"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.076114 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-568f-account-create-update-cbjxb"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 
13:53:21.085307 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-trcpf"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.110339 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rg8bj"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.122595 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-568f-account-create-update-cbjxb"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.134160 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rg8bj"] Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.222895 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228bc5a9-17f9-4434-aca9-685d14b15c62" path="/var/lib/kubelet/pods/228bc5a9-17f9-4434-aca9-685d14b15c62/volumes" Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.223554 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a123ea-8352-432e-af35-f2b275d1dbbb" path="/var/lib/kubelet/pods/95a123ea-8352-432e-af35-f2b275d1dbbb/volumes" Mar 20 13:53:21 crc kubenswrapper[4895]: I0320 13:53:21.224122 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1128d7a-df8d-4255-af0a-4ed8058e8fa4" path="/var/lib/kubelet/pods/f1128d7a-df8d-4255-af0a-4ed8058e8fa4/volumes" Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.045364 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-67bsv"] Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.063253 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-67bsv"] Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.071192 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-571b-account-create-update-5b4mj"] Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.082148 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-8bea-account-create-update-xt24z"] Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.089156 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8bea-account-create-update-xt24z"] Mar 20 13:53:22 crc kubenswrapper[4895]: I0320 13:53:22.097503 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-571b-account-create-update-5b4mj"] Mar 20 13:53:23 crc kubenswrapper[4895]: I0320 13:53:23.230911 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b287388-cee5-4065-9b38-56633ce573c2" path="/var/lib/kubelet/pods/4b287388-cee5-4065-9b38-56633ce573c2/volumes" Mar 20 13:53:23 crc kubenswrapper[4895]: I0320 13:53:23.233160 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66fce8ef-22cc-4aa4-977b-8cf7382053d5" path="/var/lib/kubelet/pods/66fce8ef-22cc-4aa4-977b-8cf7382053d5/volumes" Mar 20 13:53:23 crc kubenswrapper[4895]: I0320 13:53:23.235702 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795dcfb9-6fb6-42f6-adbf-4e77aef2bd90" path="/var/lib/kubelet/pods/795dcfb9-6fb6-42f6-adbf-4e77aef2bd90/volumes" Mar 20 13:53:25 crc kubenswrapper[4895]: I0320 13:53:25.213261 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:53:25 crc kubenswrapper[4895]: I0320 13:53:25.819992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569"} Mar 20 13:53:55 crc kubenswrapper[4895]: I0320 13:53:55.150017 4895 generic.go:334] "Generic (PLEG): container finished" podID="19650c9a-aeda-44ce-9793-a3b03e1d361d" containerID="dc27308d201e386c96bb3c7dd6f058f0fed28f4c70c3349a4246cea755cbe5a6" exitCode=0 Mar 20 13:53:55 crc kubenswrapper[4895]: I0320 
13:53:55.150080 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" event={"ID":"19650c9a-aeda-44ce-9793-a3b03e1d361d","Type":"ContainerDied","Data":"dc27308d201e386c96bb3c7dd6f058f0fed28f4c70c3349a4246cea755cbe5a6"} Mar 20 13:53:56 crc kubenswrapper[4895]: I0320 13:53:56.975554 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.072869 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory\") pod \"19650c9a-aeda-44ce-9793-a3b03e1d361d\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.073130 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rjdz\" (UniqueName: \"kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz\") pod \"19650c9a-aeda-44ce-9793-a3b03e1d361d\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.073306 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam\") pod \"19650c9a-aeda-44ce-9793-a3b03e1d361d\" (UID: \"19650c9a-aeda-44ce-9793-a3b03e1d361d\") " Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.079409 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz" (OuterVolumeSpecName: "kube-api-access-2rjdz") pod "19650c9a-aeda-44ce-9793-a3b03e1d361d" (UID: "19650c9a-aeda-44ce-9793-a3b03e1d361d"). InnerVolumeSpecName "kube-api-access-2rjdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.104196 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "19650c9a-aeda-44ce-9793-a3b03e1d361d" (UID: "19650c9a-aeda-44ce-9793-a3b03e1d361d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.105106 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory" (OuterVolumeSpecName: "inventory") pod "19650c9a-aeda-44ce-9793-a3b03e1d361d" (UID: "19650c9a-aeda-44ce-9793-a3b03e1d361d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.170823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" event={"ID":"19650c9a-aeda-44ce-9793-a3b03e1d361d","Type":"ContainerDied","Data":"5337d007ac3ad3d8fc9a6b481b46cd286939a1deb30c4f72b3b102d882a6ceeb"} Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.170877 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5337d007ac3ad3d8fc9a6b481b46cd286939a1deb30c4f72b3b102d882a6ceeb" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.170876 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wkfk4" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.176709 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.176769 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rjdz\" (UniqueName: \"kubernetes.io/projected/19650c9a-aeda-44ce-9793-a3b03e1d361d-kube-api-access-2rjdz\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.176792 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19650c9a-aeda-44ce-9793-a3b03e1d361d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.296553 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm"] Mar 20 13:53:57 crc kubenswrapper[4895]: E0320 13:53:57.297210 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19650c9a-aeda-44ce-9793-a3b03e1d361d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.297236 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="19650c9a-aeda-44ce-9793-a3b03e1d361d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.297501 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="19650c9a-aeda-44ce-9793-a3b03e1d361d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.298511 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.302757 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.302984 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.303367 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.304219 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.320179 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm"] Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.380566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.380686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.380771 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nwv7\" (UniqueName: \"kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.482864 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nwv7\" (UniqueName: \"kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.482977 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.483054 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.486928 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.486942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.501680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nwv7\" (UniqueName: \"kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:57 crc kubenswrapper[4895]: I0320 13:53:57.615193 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:53:58 crc kubenswrapper[4895]: I0320 13:53:58.172752 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm"] Mar 20 13:53:59 crc kubenswrapper[4895]: I0320 13:53:59.038975 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-68krz"] Mar 20 13:53:59 crc kubenswrapper[4895]: I0320 13:53:59.050047 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-68krz"] Mar 20 13:53:59 crc kubenswrapper[4895]: I0320 13:53:59.205517 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" event={"ID":"77679ba3-7833-453d-b008-536582648587","Type":"ContainerStarted","Data":"bf0216e88361ffb9746d8084a89b19e1296a5918220f1a1a1b74e9a3c414425f"} Mar 20 13:53:59 crc kubenswrapper[4895]: I0320 13:53:59.205573 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" event={"ID":"77679ba3-7833-453d-b008-536582648587","Type":"ContainerStarted","Data":"eb230171a6039f8944581ffbb72e9c86b6e3de79902ebd6ea9bf94b83d3ad081"} Mar 20 13:53:59 crc kubenswrapper[4895]: I0320 13:53:59.227035 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" podStartSLOduration=1.796893726 podStartE2EDuration="2.227012473s" podCreationTimestamp="2026-03-20 13:53:57 +0000 UTC" firstStartedPulling="2026-03-20 13:53:58.183775876 +0000 UTC m=+1937.693494842" lastFinishedPulling="2026-03-20 13:53:58.613894623 +0000 UTC m=+1938.123613589" observedRunningTime="2026-03-20 13:53:59.217741316 +0000 UTC m=+1938.727460282" watchObservedRunningTime="2026-03-20 13:53:59.227012473 +0000 UTC m=+1938.736731449" Mar 20 13:53:59 crc 
kubenswrapper[4895]: I0320 13:53:59.228525 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b0f7494-037c-462e-bd52-4a4d2469c62d" path="/var/lib/kubelet/pods/2b0f7494-037c-462e-bd52-4a4d2469c62d/volumes" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.141176 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-trnlb"] Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.143116 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.145185 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.146436 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.149929 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.161992 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-trnlb"] Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.241160 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwxh\" (UniqueName: \"kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh\") pod \"auto-csr-approver-29566914-trnlb\" (UID: \"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e\") " pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.343360 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwxh\" (UniqueName: 
\"kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh\") pod \"auto-csr-approver-29566914-trnlb\" (UID: \"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e\") " pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.363346 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwxh\" (UniqueName: \"kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh\") pod \"auto-csr-approver-29566914-trnlb\" (UID: \"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e\") " pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.474923 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:00 crc kubenswrapper[4895]: I0320 13:54:00.961886 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-trnlb"] Mar 20 13:54:00 crc kubenswrapper[4895]: W0320 13:54:00.963548 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25861fd5_ca0e_4822_9cc7_e8b3e53b5d4e.slice/crio-eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21 WatchSource:0}: Error finding container eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21: Status 404 returned error can't find the container with id eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21 Mar 20 13:54:01 crc kubenswrapper[4895]: I0320 13:54:01.236335 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-trnlb" event={"ID":"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e","Type":"ContainerStarted","Data":"eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21"} Mar 20 13:54:02 crc kubenswrapper[4895]: I0320 13:54:02.248907 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566914-trnlb" event={"ID":"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e","Type":"ContainerStarted","Data":"99176121a628006e8cd09e7d6a983e4142f60503ab30551b920724996ab58187"} Mar 20 13:54:02 crc kubenswrapper[4895]: I0320 13:54:02.265557 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566914-trnlb" podStartSLOduration=1.33390645 podStartE2EDuration="2.265539221s" podCreationTimestamp="2026-03-20 13:54:00 +0000 UTC" firstStartedPulling="2026-03-20 13:54:00.965589605 +0000 UTC m=+1940.475308561" lastFinishedPulling="2026-03-20 13:54:01.897222346 +0000 UTC m=+1941.406941332" observedRunningTime="2026-03-20 13:54:02.260772303 +0000 UTC m=+1941.770491269" watchObservedRunningTime="2026-03-20 13:54:02.265539221 +0000 UTC m=+1941.775258187" Mar 20 13:54:03 crc kubenswrapper[4895]: I0320 13:54:03.262387 4895 generic.go:334] "Generic (PLEG): container finished" podID="25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" containerID="99176121a628006e8cd09e7d6a983e4142f60503ab30551b920724996ab58187" exitCode=0 Mar 20 13:54:03 crc kubenswrapper[4895]: I0320 13:54:03.262488 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-trnlb" event={"ID":"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e","Type":"ContainerDied","Data":"99176121a628006e8cd09e7d6a983e4142f60503ab30551b920724996ab58187"} Mar 20 13:54:04 crc kubenswrapper[4895]: I0320 13:54:04.907811 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:04 crc kubenswrapper[4895]: I0320 13:54:04.946166 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nwxh\" (UniqueName: \"kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh\") pod \"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e\" (UID: \"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e\") " Mar 20 13:54:04 crc kubenswrapper[4895]: I0320 13:54:04.959230 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh" (OuterVolumeSpecName: "kube-api-access-6nwxh") pod "25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" (UID: "25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e"). InnerVolumeSpecName "kube-api-access-6nwxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.050600 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nwxh\" (UniqueName: \"kubernetes.io/projected/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e-kube-api-access-6nwxh\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.289813 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-trnlb" event={"ID":"25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e","Type":"ContainerDied","Data":"eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21"} Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.289850 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf0b8e831a48606fd5ae099197c8a24ec07bb1a51fa973b852edcb4f2f65c21" Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.289903 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-trnlb" Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.341276 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-h8mbp"] Mar 20 13:54:05 crc kubenswrapper[4895]: I0320 13:54:05.352749 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-h8mbp"] Mar 20 13:54:07 crc kubenswrapper[4895]: I0320 13:54:07.239080 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749126f3-49e7-49b4-b8b5-b8a853df2990" path="/var/lib/kubelet/pods/749126f3-49e7-49b4-b8b5-b8a853df2990/volumes" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.320993 4895 scope.go:117] "RemoveContainer" containerID="f2e822bad642583c2e2023cb57578e9ac5b7b26c1ccf56b2be7d83cbf0ae5bc8" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.411154 4895 scope.go:117] "RemoveContainer" containerID="b3dfb5003971363de16e8c4e190acfd80815d27ea5d973767348b69576c43919" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.451191 4895 scope.go:117] "RemoveContainer" containerID="0dbc1613fa99afe295156edac6f37b60876898bbb5cdcc85937fc7b877a669a4" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.513436 4895 scope.go:117] "RemoveContainer" containerID="66ba7de2466ae09e820baab9478a0a185a4d58d159afdaa143a532cdb5c982b5" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.569858 4895 scope.go:117] "RemoveContainer" containerID="2ca0045f3b36bbb500f5c3da216cce47c61028ede373a9f162d0d442b21ec387" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.605958 4895 scope.go:117] "RemoveContainer" containerID="19fd0a9821c640ce869017aa8d909f86de2f344e1f1c8ba94cde5d7fd88f7b3d" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.652714 4895 scope.go:117] "RemoveContainer" containerID="223a4adb81555c05ddc3c13185382400e24a7c287c4308a1a9a312a683b54223" Mar 20 13:54:18 crc kubenswrapper[4895]: I0320 13:54:18.676306 4895 
scope.go:117] "RemoveContainer" containerID="32242598d8cc028cb5b3a1aeb42dd47a8860294cb002fbfabd43c40c388f646a" Mar 20 13:54:25 crc kubenswrapper[4895]: I0320 13:54:25.059860 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hnr4b"] Mar 20 13:54:25 crc kubenswrapper[4895]: I0320 13:54:25.072876 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hnr4b"] Mar 20 13:54:25 crc kubenswrapper[4895]: I0320 13:54:25.229346 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cb01ad-b24f-4840-b7d8-6118730ac633" path="/var/lib/kubelet/pods/92cb01ad-b24f-4840-b7d8-6118730ac633/volumes" Mar 20 13:54:30 crc kubenswrapper[4895]: I0320 13:54:30.031442 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6sw4l"] Mar 20 13:54:30 crc kubenswrapper[4895]: I0320 13:54:30.045266 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6sw4l"] Mar 20 13:54:31 crc kubenswrapper[4895]: I0320 13:54:31.233617 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e0421a-d787-4327-b7bf-b4f974871690" path="/var/lib/kubelet/pods/23e0421a-d787-4327-b7bf-b4f974871690/volumes" Mar 20 13:54:42 crc kubenswrapper[4895]: I0320 13:54:42.725686 4895 generic.go:334] "Generic (PLEG): container finished" podID="77679ba3-7833-453d-b008-536582648587" containerID="bf0216e88361ffb9746d8084a89b19e1296a5918220f1a1a1b74e9a3c414425f" exitCode=0 Mar 20 13:54:42 crc kubenswrapper[4895]: I0320 13:54:42.725794 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" event={"ID":"77679ba3-7833-453d-b008-536582648587","Type":"ContainerDied","Data":"bf0216e88361ffb9746d8084a89b19e1296a5918220f1a1a1b74e9a3c414425f"} Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.266596 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.437762 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory\") pod \"77679ba3-7833-453d-b008-536582648587\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.438263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam\") pod \"77679ba3-7833-453d-b008-536582648587\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.438359 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nwv7\" (UniqueName: \"kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7\") pod \"77679ba3-7833-453d-b008-536582648587\" (UID: \"77679ba3-7833-453d-b008-536582648587\") " Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.444734 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7" (OuterVolumeSpecName: "kube-api-access-5nwv7") pod "77679ba3-7833-453d-b008-536582648587" (UID: "77679ba3-7833-453d-b008-536582648587"). InnerVolumeSpecName "kube-api-access-5nwv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.468690 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory" (OuterVolumeSpecName: "inventory") pod "77679ba3-7833-453d-b008-536582648587" (UID: "77679ba3-7833-453d-b008-536582648587"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.472640 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77679ba3-7833-453d-b008-536582648587" (UID: "77679ba3-7833-453d-b008-536582648587"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.541133 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.541480 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77679ba3-7833-453d-b008-536582648587-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.541500 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nwv7\" (UniqueName: \"kubernetes.io/projected/77679ba3-7833-453d-b008-536582648587-kube-api-access-5nwv7\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.747310 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" event={"ID":"77679ba3-7833-453d-b008-536582648587","Type":"ContainerDied","Data":"eb230171a6039f8944581ffbb72e9c86b6e3de79902ebd6ea9bf94b83d3ad081"} Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.747357 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb230171a6039f8944581ffbb72e9c86b6e3de79902ebd6ea9bf94b83d3ad081" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 
13:54:44.747366 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.904883 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhd4n"] Mar 20 13:54:44 crc kubenswrapper[4895]: E0320 13:54:44.905628 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" containerName="oc" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.905662 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" containerName="oc" Mar 20 13:54:44 crc kubenswrapper[4895]: E0320 13:54:44.905703 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77679ba3-7833-453d-b008-536582648587" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.905717 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="77679ba3-7833-453d-b008-536582648587" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.906074 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" containerName="oc" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.906132 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="77679ba3-7833-453d-b008-536582648587" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.907340 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.909624 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.910001 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.911740 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.916304 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:54:44 crc kubenswrapper[4895]: I0320 13:54:44.939525 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhd4n"] Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.051075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84v2\" (UniqueName: \"kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.051280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.051750 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.154380 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.154751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.154979 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h84v2\" (UniqueName: \"kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.159424 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.160130 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.177875 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84v2\" (UniqueName: \"kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2\") pod \"ssh-known-hosts-edpm-deployment-jhd4n\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.227962 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:45 crc kubenswrapper[4895]: I0320 13:54:45.836774 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhd4n"] Mar 20 13:54:45 crc kubenswrapper[4895]: W0320 13:54:45.839070 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd18f4c75_cf01_4b82_844a_f24b83ddfb7a.slice/crio-cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879 WatchSource:0}: Error finding container cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879: Status 404 returned error can't find the container with id cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879 Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.770268 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" event={"ID":"d18f4c75-cf01-4b82-844a-f24b83ddfb7a","Type":"ContainerStarted","Data":"2dd1dbf4e593280036f0238a36642f8bf353e69649fb65d8ff5ff5d3f5bc1502"} 
Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.771689 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" event={"ID":"d18f4c75-cf01-4b82-844a-f24b83ddfb7a","Type":"ContainerStarted","Data":"cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879"} Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.788519 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.791165 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.801278 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.805679 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" podStartSLOduration=2.3397508399999998 podStartE2EDuration="2.805656945s" podCreationTimestamp="2026-03-20 13:54:44 +0000 UTC" firstStartedPulling="2026-03-20 13:54:45.843051631 +0000 UTC m=+1985.352770597" lastFinishedPulling="2026-03-20 13:54:46.308957736 +0000 UTC m=+1985.818676702" observedRunningTime="2026-03-20 13:54:46.790501994 +0000 UTC m=+1986.300220980" watchObservedRunningTime="2026-03-20 13:54:46.805656945 +0000 UTC m=+1986.315375921" Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.892535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkck\" (UniqueName: \"kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.892625 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.892664 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.967118 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:54:46 crc kubenswrapper[4895]: I0320 13:54:46.968955 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:46.995664 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:46.995736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:46.995992 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkck\" (UniqueName: \"kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:46.997055 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:46.997461 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " 
pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.029156 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkck\" (UniqueName: \"kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck\") pod \"redhat-operators-c82f7\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.036150 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.098160 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcwm9\" (UniqueName: \"kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.098451 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.098634 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.112494 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.200711 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.200880 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.201230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.201364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.201663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcwm9\" (UniqueName: \"kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " 
pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.227851 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcwm9\" (UniqueName: \"kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9\") pod \"certified-operators-qn2cp\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.295532 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.664925 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:54:47 crc kubenswrapper[4895]: W0320 13:54:47.673363 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3258d433_c990_40f9_9b48_729c3bc7ad30.slice/crio-a9a50b024036f6d41c7928312ed88d3e9fd8c608a416f570429295475a8dcc2f WatchSource:0}: Error finding container a9a50b024036f6d41c7928312ed88d3e9fd8c608a416f570429295475a8dcc2f: Status 404 returned error can't find the container with id a9a50b024036f6d41c7928312ed88d3e9fd8c608a416f570429295475a8dcc2f Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.784182 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerStarted","Data":"a9a50b024036f6d41c7928312ed88d3e9fd8c608a416f570429295475a8dcc2f"} Mar 20 13:54:47 crc kubenswrapper[4895]: I0320 13:54:47.862080 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:54:48 crc kubenswrapper[4895]: I0320 13:54:48.795595 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerID="8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c" exitCode=0 Mar 20 13:54:48 crc kubenswrapper[4895]: I0320 13:54:48.795645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerDied","Data":"8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c"} Mar 20 13:54:48 crc kubenswrapper[4895]: I0320 13:54:48.796005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerStarted","Data":"eacf7e174e0cb626aa3dc28a535fe446aabbf3f3d42b63c1ac599754831d7059"} Mar 20 13:54:48 crc kubenswrapper[4895]: I0320 13:54:48.802611 4895 generic.go:334] "Generic (PLEG): container finished" podID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerID="4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f" exitCode=0 Mar 20 13:54:48 crc kubenswrapper[4895]: I0320 13:54:48.802651 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerDied","Data":"4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f"} Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.388902 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.392097 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.403354 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.556278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.556328 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw28q\" (UniqueName: \"kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.556380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.658537 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.658677 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lw28q\" (UniqueName: \"kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.658840 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.658848 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.659402 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.679249 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw28q\" (UniqueName: \"kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q\") pod \"community-operators-w4gk2\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.725437 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.818966 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerStarted","Data":"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0"} Mar 20 13:54:49 crc kubenswrapper[4895]: I0320 13:54:49.826646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerStarted","Data":"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc"} Mar 20 13:54:50 crc kubenswrapper[4895]: I0320 13:54:50.267575 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:54:50 crc kubenswrapper[4895]: I0320 13:54:50.836884 4895 generic.go:334] "Generic (PLEG): container finished" podID="53f9655e-3d95-404c-9699-a1df11b6197e" containerID="47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba" exitCode=0 Mar 20 13:54:50 crc kubenswrapper[4895]: I0320 13:54:50.836981 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerDied","Data":"47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba"} Mar 20 13:54:50 crc kubenswrapper[4895]: I0320 13:54:50.837025 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerStarted","Data":"72ccd20a04ac97ab762c7d3181ed3cda0ec6726156301acd6ae97c020f512f19"} Mar 20 13:54:52 crc kubenswrapper[4895]: I0320 13:54:52.859702 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" 
event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerStarted","Data":"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e"} Mar 20 13:54:52 crc kubenswrapper[4895]: I0320 13:54:52.862970 4895 generic.go:334] "Generic (PLEG): container finished" podID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerID="06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0" exitCode=0 Mar 20 13:54:52 crc kubenswrapper[4895]: I0320 13:54:52.863029 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerDied","Data":"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0"} Mar 20 13:54:53 crc kubenswrapper[4895]: I0320 13:54:53.892488 4895 generic.go:334] "Generic (PLEG): container finished" podID="d18f4c75-cf01-4b82-844a-f24b83ddfb7a" containerID="2dd1dbf4e593280036f0238a36642f8bf353e69649fb65d8ff5ff5d3f5bc1502" exitCode=0 Mar 20 13:54:53 crc kubenswrapper[4895]: I0320 13:54:53.893087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" event={"ID":"d18f4c75-cf01-4b82-844a-f24b83ddfb7a","Type":"ContainerDied","Data":"2dd1dbf4e593280036f0238a36642f8bf353e69649fb65d8ff5ff5d3f5bc1502"} Mar 20 13:54:53 crc kubenswrapper[4895]: I0320 13:54:53.904674 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerStarted","Data":"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab"} Mar 20 13:54:53 crc kubenswrapper[4895]: I0320 13:54:53.970786 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qn2cp" podStartSLOduration=3.331994889 podStartE2EDuration="7.970764895s" podCreationTimestamp="2026-03-20 13:54:46 +0000 UTC" firstStartedPulling="2026-03-20 13:54:48.798887968 +0000 
UTC m=+1988.308606934" lastFinishedPulling="2026-03-20 13:54:53.437657974 +0000 UTC m=+1992.947376940" observedRunningTime="2026-03-20 13:54:53.955720956 +0000 UTC m=+1993.465439922" watchObservedRunningTime="2026-03-20 13:54:53.970764895 +0000 UTC m=+1993.480483861" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.448994 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.589723 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam\") pod \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.589874 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h84v2\" (UniqueName: \"kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2\") pod \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.590072 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0\") pod \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\" (UID: \"d18f4c75-cf01-4b82-844a-f24b83ddfb7a\") " Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.598993 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2" (OuterVolumeSpecName: "kube-api-access-h84v2") pod "d18f4c75-cf01-4b82-844a-f24b83ddfb7a" (UID: "d18f4c75-cf01-4b82-844a-f24b83ddfb7a"). InnerVolumeSpecName "kube-api-access-h84v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.623036 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d18f4c75-cf01-4b82-844a-f24b83ddfb7a" (UID: "d18f4c75-cf01-4b82-844a-f24b83ddfb7a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.634631 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d18f4c75-cf01-4b82-844a-f24b83ddfb7a" (UID: "d18f4c75-cf01-4b82-844a-f24b83ddfb7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.692215 4895 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.692251 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.692264 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h84v2\" (UniqueName: \"kubernetes.io/projected/d18f4c75-cf01-4b82-844a-f24b83ddfb7a-kube-api-access-h84v2\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.924033 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" 
event={"ID":"d18f4c75-cf01-4b82-844a-f24b83ddfb7a","Type":"ContainerDied","Data":"cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879"} Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.924080 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5c60d1d05970b8e34e1ced9b27ab7e54f45e921b0dcaf46c277fe2c3ece879" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.924059 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhd4n" Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.926693 4895 generic.go:334] "Generic (PLEG): container finished" podID="53f9655e-3d95-404c-9699-a1df11b6197e" containerID="6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e" exitCode=0 Mar 20 13:54:55 crc kubenswrapper[4895]: I0320 13:54:55.926735 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerDied","Data":"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e"} Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.037308 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl"] Mar 20 13:54:56 crc kubenswrapper[4895]: E0320 13:54:56.038850 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18f4c75-cf01-4b82-844a-f24b83ddfb7a" containerName="ssh-known-hosts-edpm-deployment" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.038873 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18f4c75-cf01-4b82-844a-f24b83ddfb7a" containerName="ssh-known-hosts-edpm-deployment" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.039179 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18f4c75-cf01-4b82-844a-f24b83ddfb7a" containerName="ssh-known-hosts-edpm-deployment" Mar 20 13:54:56 crc 
kubenswrapper[4895]: I0320 13:54:56.040117 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.045517 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.045969 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.046036 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.045972 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.061445 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl"] Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.205976 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.206170 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77z5n\" (UniqueName: \"kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.206212 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.308323 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77z5n\" (UniqueName: \"kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.308561 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.308781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.314440 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.314460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.330109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77z5n\" (UniqueName: \"kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-txfsl\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.362160 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.924296 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl"] Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.940584 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerStarted","Data":"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929"} Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.942954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" event={"ID":"9fde50f5-9c7a-4737-9d42-f6df58df9629","Type":"ContainerStarted","Data":"96ebd696e2463b29d15b06cea28b76beef40e61851efec017f61483a9a1c7411"} Mar 20 13:54:56 crc kubenswrapper[4895]: I0320 13:54:56.967990 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w4gk2" podStartSLOduration=2.20797955 podStartE2EDuration="7.967970206s" podCreationTimestamp="2026-03-20 13:54:49 +0000 UTC" firstStartedPulling="2026-03-20 13:54:50.838664861 +0000 UTC m=+1990.348383827" lastFinishedPulling="2026-03-20 13:54:56.598655507 +0000 UTC m=+1996.108374483" observedRunningTime="2026-03-20 13:54:56.957168591 +0000 UTC m=+1996.466887557" watchObservedRunningTime="2026-03-20 13:54:56.967970206 +0000 UTC m=+1996.477689172" Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 13:54:57.295992 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 13:54:57.296242 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 
13:54:57.956426 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" event={"ID":"9fde50f5-9c7a-4737-9d42-f6df58df9629","Type":"ContainerStarted","Data":"eba882b27b6826dc0e3be6e7170b5bc4d8c05df94c7ad8d6427b709294aa501c"} Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 13:54:57.962895 4895 generic.go:334] "Generic (PLEG): container finished" podID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerID="aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc" exitCode=0 Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 13:54:57.962945 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerDied","Data":"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc"} Mar 20 13:54:57 crc kubenswrapper[4895]: I0320 13:54:57.990711 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" podStartSLOduration=1.592855804 podStartE2EDuration="1.990690041s" podCreationTimestamp="2026-03-20 13:54:56 +0000 UTC" firstStartedPulling="2026-03-20 13:54:56.930751393 +0000 UTC m=+1996.440470379" lastFinishedPulling="2026-03-20 13:54:57.32858565 +0000 UTC m=+1996.838304616" observedRunningTime="2026-03-20 13:54:57.982793848 +0000 UTC m=+1997.492512834" watchObservedRunningTime="2026-03-20 13:54:57.990690041 +0000 UTC m=+1997.500409007" Mar 20 13:54:58 crc kubenswrapper[4895]: I0320 13:54:58.341482 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qn2cp" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" probeResult="failure" output=< Mar 20 13:54:58 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:54:58 crc kubenswrapper[4895]: > Mar 20 13:54:58 crc kubenswrapper[4895]: I0320 13:54:58.979491 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerStarted","Data":"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285"} Mar 20 13:54:59 crc kubenswrapper[4895]: I0320 13:54:59.009056 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c82f7" podStartSLOduration=3.312248915 podStartE2EDuration="13.009036561s" podCreationTimestamp="2026-03-20 13:54:46 +0000 UTC" firstStartedPulling="2026-03-20 13:54:48.804114926 +0000 UTC m=+1988.313833892" lastFinishedPulling="2026-03-20 13:54:58.500902562 +0000 UTC m=+1998.010621538" observedRunningTime="2026-03-20 13:54:59.00493486 +0000 UTC m=+1998.514653826" watchObservedRunningTime="2026-03-20 13:54:59.009036561 +0000 UTC m=+1998.518755527" Mar 20 13:54:59 crc kubenswrapper[4895]: I0320 13:54:59.726056 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:54:59 crc kubenswrapper[4895]: I0320 13:54:59.726453 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:55:00 crc kubenswrapper[4895]: I0320 13:55:00.776429 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-w4gk2" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:00 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:00 crc kubenswrapper[4895]: > Mar 20 13:55:05 crc kubenswrapper[4895]: I0320 13:55:05.045200 4895 generic.go:334] "Generic (PLEG): container finished" podID="9fde50f5-9c7a-4737-9d42-f6df58df9629" containerID="eba882b27b6826dc0e3be6e7170b5bc4d8c05df94c7ad8d6427b709294aa501c" exitCode=0 Mar 20 13:55:05 crc kubenswrapper[4895]: I0320 
13:55:05.045288 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" event={"ID":"9fde50f5-9c7a-4737-9d42-f6df58df9629","Type":"ContainerDied","Data":"eba882b27b6826dc0e3be6e7170b5bc4d8c05df94c7ad8d6427b709294aa501c"} Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.632663 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.755254 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77z5n\" (UniqueName: \"kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n\") pod \"9fde50f5-9c7a-4737-9d42-f6df58df9629\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.755584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam\") pod \"9fde50f5-9c7a-4737-9d42-f6df58df9629\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.755927 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory\") pod \"9fde50f5-9c7a-4737-9d42-f6df58df9629\" (UID: \"9fde50f5-9c7a-4737-9d42-f6df58df9629\") " Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.760783 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n" (OuterVolumeSpecName: "kube-api-access-77z5n") pod "9fde50f5-9c7a-4737-9d42-f6df58df9629" (UID: "9fde50f5-9c7a-4737-9d42-f6df58df9629"). InnerVolumeSpecName "kube-api-access-77z5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.784748 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory" (OuterVolumeSpecName: "inventory") pod "9fde50f5-9c7a-4737-9d42-f6df58df9629" (UID: "9fde50f5-9c7a-4737-9d42-f6df58df9629"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.802499 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9fde50f5-9c7a-4737-9d42-f6df58df9629" (UID: "9fde50f5-9c7a-4737-9d42-f6df58df9629"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.859083 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.859114 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77z5n\" (UniqueName: \"kubernetes.io/projected/9fde50f5-9c7a-4737-9d42-f6df58df9629-kube-api-access-77z5n\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:06 crc kubenswrapper[4895]: I0320 13:55:06.859127 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9fde50f5-9c7a-4737-9d42-f6df58df9629-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.084938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" 
event={"ID":"9fde50f5-9c7a-4737-9d42-f6df58df9629","Type":"ContainerDied","Data":"96ebd696e2463b29d15b06cea28b76beef40e61851efec017f61483a9a1c7411"} Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.084987 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ebd696e2463b29d15b06cea28b76beef40e61851efec017f61483a9a1c7411" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.085007 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-txfsl" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.113712 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.113790 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.146185 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb"] Mar 20 13:55:07 crc kubenswrapper[4895]: E0320 13:55:07.146690 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fde50f5-9c7a-4737-9d42-f6df58df9629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.146707 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fde50f5-9c7a-4737-9d42-f6df58df9629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.146907 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fde50f5-9c7a-4737-9d42-f6df58df9629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.147746 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.151988 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.152031 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.152119 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.152214 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.166301 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb"] Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.267469 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.267519 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vjx\" (UniqueName: \"kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.267895 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.369884 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.370020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.370048 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vjx\" (UniqueName: \"kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.373679 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.375331 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.388078 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vjx\" (UniqueName: \"kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:07 crc kubenswrapper[4895]: I0320 13:55:07.522745 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:08 crc kubenswrapper[4895]: I0320 13:55:08.100901 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb"] Mar 20 13:55:08 crc kubenswrapper[4895]: I0320 13:55:08.168773 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c82f7" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:08 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:08 crc kubenswrapper[4895]: > Mar 20 13:55:08 crc kubenswrapper[4895]: I0320 13:55:08.343018 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qn2cp" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:08 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:08 crc kubenswrapper[4895]: > Mar 20 13:55:09 crc kubenswrapper[4895]: I0320 13:55:09.103288 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" event={"ID":"1a2a0abe-d614-4f65-b832-06b9ddbdef54","Type":"ContainerStarted","Data":"78b1596127121835e26daf103b2942a801494a5256eadf6bd1a2c3b4e05173d4"} Mar 20 13:55:09 crc kubenswrapper[4895]: I0320 13:55:09.103609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" event={"ID":"1a2a0abe-d614-4f65-b832-06b9ddbdef54","Type":"ContainerStarted","Data":"dde5cb84de15d72b29a364d2ec8ec279510bd5960c32a88e3aa594fde0f86ded"} Mar 20 13:55:09 crc kubenswrapper[4895]: I0320 13:55:09.123031 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" 
podStartSLOduration=1.6421081659999999 podStartE2EDuration="2.123010348s" podCreationTimestamp="2026-03-20 13:55:07 +0000 UTC" firstStartedPulling="2026-03-20 13:55:08.095200957 +0000 UTC m=+2007.604919923" lastFinishedPulling="2026-03-20 13:55:08.576103139 +0000 UTC m=+2008.085822105" observedRunningTime="2026-03-20 13:55:09.119666336 +0000 UTC m=+2008.629385302" watchObservedRunningTime="2026-03-20 13:55:09.123010348 +0000 UTC m=+2008.632729314" Mar 20 13:55:09 crc kubenswrapper[4895]: I0320 13:55:09.778873 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:55:09 crc kubenswrapper[4895]: I0320 13:55:09.837501 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:55:10 crc kubenswrapper[4895]: I0320 13:55:10.020153 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.044239 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vxtht"] Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.059469 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vxtht"] Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.119975 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w4gk2" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="registry-server" containerID="cri-o://64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929" gracePeriod=2 Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.223264 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293e033c-47da-4d3e-af29-088700965fc1" path="/var/lib/kubelet/pods/293e033c-47da-4d3e-af29-088700965fc1/volumes" Mar 20 13:55:11 crc kubenswrapper[4895]: 
I0320 13:55:11.761550 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.875065 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities\") pod \"53f9655e-3d95-404c-9699-a1df11b6197e\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.875463 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content\") pod \"53f9655e-3d95-404c-9699-a1df11b6197e\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.875526 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw28q\" (UniqueName: \"kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q\") pod \"53f9655e-3d95-404c-9699-a1df11b6197e\" (UID: \"53f9655e-3d95-404c-9699-a1df11b6197e\") " Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.875919 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities" (OuterVolumeSpecName: "utilities") pod "53f9655e-3d95-404c-9699-a1df11b6197e" (UID: "53f9655e-3d95-404c-9699-a1df11b6197e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.876076 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.882042 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q" (OuterVolumeSpecName: "kube-api-access-lw28q") pod "53f9655e-3d95-404c-9699-a1df11b6197e" (UID: "53f9655e-3d95-404c-9699-a1df11b6197e"). InnerVolumeSpecName "kube-api-access-lw28q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.930049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53f9655e-3d95-404c-9699-a1df11b6197e" (UID: "53f9655e-3d95-404c-9699-a1df11b6197e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.977680 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f9655e-3d95-404c-9699-a1df11b6197e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:11 crc kubenswrapper[4895]: I0320 13:55:11.977714 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw28q\" (UniqueName: \"kubernetes.io/projected/53f9655e-3d95-404c-9699-a1df11b6197e-kube-api-access-lw28q\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.130847 4895 generic.go:334] "Generic (PLEG): container finished" podID="53f9655e-3d95-404c-9699-a1df11b6197e" containerID="64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929" exitCode=0 Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.130892 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerDied","Data":"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929"} Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.130933 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w4gk2" event={"ID":"53f9655e-3d95-404c-9699-a1df11b6197e","Type":"ContainerDied","Data":"72ccd20a04ac97ab762c7d3181ed3cda0ec6726156301acd6ae97c020f512f19"} Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.130951 4895 scope.go:117] "RemoveContainer" containerID="64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.131815 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w4gk2" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.159234 4895 scope.go:117] "RemoveContainer" containerID="6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.167802 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.177510 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w4gk2"] Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.185501 4895 scope.go:117] "RemoveContainer" containerID="47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.238091 4895 scope.go:117] "RemoveContainer" containerID="64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929" Mar 20 13:55:12 crc kubenswrapper[4895]: E0320 13:55:12.239691 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929\": container with ID starting with 64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929 not found: ID does not exist" containerID="64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.239749 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929"} err="failed to get container status \"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929\": rpc error: code = NotFound desc = could not find container \"64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929\": container with ID starting with 64d454be341ca7bf4fa5f7d38f2018a566bfe0b448f1c5c1756a602da599f929 not 
found: ID does not exist" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.239778 4895 scope.go:117] "RemoveContainer" containerID="6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e" Mar 20 13:55:12 crc kubenswrapper[4895]: E0320 13:55:12.240354 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e\": container with ID starting with 6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e not found: ID does not exist" containerID="6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.240461 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e"} err="failed to get container status \"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e\": rpc error: code = NotFound desc = could not find container \"6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e\": container with ID starting with 6c43d002580429215724d1ead8faea9206628ae8bb08da8c35745d0bd768ad7e not found: ID does not exist" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.240544 4895 scope.go:117] "RemoveContainer" containerID="47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba" Mar 20 13:55:12 crc kubenswrapper[4895]: E0320 13:55:12.240992 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba\": container with ID starting with 47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba not found: ID does not exist" containerID="47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba" Mar 20 13:55:12 crc kubenswrapper[4895]: I0320 13:55:12.241018 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba"} err="failed to get container status \"47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba\": rpc error: code = NotFound desc = could not find container \"47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba\": container with ID starting with 47539ad756f458bb95f3a6f1eefec6409cba96f0152735fec0d001f57a5606ba not found: ID does not exist" Mar 20 13:55:13 crc kubenswrapper[4895]: I0320 13:55:13.226731 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" path="/var/lib/kubelet/pods/53f9655e-3d95-404c-9699-a1df11b6197e/volumes" Mar 20 13:55:17 crc kubenswrapper[4895]: I0320 13:55:17.348572 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:55:17 crc kubenswrapper[4895]: I0320 13:55:17.400594 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:55:17 crc kubenswrapper[4895]: I0320 13:55:17.974965 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.159498 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c82f7" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:18 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:18 crc kubenswrapper[4895]: > Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.186924 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a2a0abe-d614-4f65-b832-06b9ddbdef54" containerID="78b1596127121835e26daf103b2942a801494a5256eadf6bd1a2c3b4e05173d4" 
exitCode=0 Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.187016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" event={"ID":"1a2a0abe-d614-4f65-b832-06b9ddbdef54","Type":"ContainerDied","Data":"78b1596127121835e26daf103b2942a801494a5256eadf6bd1a2c3b4e05173d4"} Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.893579 4895 scope.go:117] "RemoveContainer" containerID="bbdd1d1f96c305553c0cdb619b80965b4410e25181f8fb5a9566ba1154feba29" Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.940381 4895 scope.go:117] "RemoveContainer" containerID="c09da54244a40e429af0155a9927e13121445bcdc6fc0a3d2259bb590d5d6e30" Mar 20 13:55:18 crc kubenswrapper[4895]: I0320 13:55:18.997840 4895 scope.go:117] "RemoveContainer" containerID="b85da950b26165d40e7a5917881f1c1c1993e9abc3472d57087bb6351ded117d" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.198894 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qn2cp" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" containerID="cri-o://c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab" gracePeriod=2 Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.733836 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.759649 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.856584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content\") pod \"172d6045-7f8e-45ca-b75b-c12f0779e686\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.856714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam\") pod \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.856754 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities\") pod \"172d6045-7f8e-45ca-b75b-c12f0779e686\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.856834 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8vjx\" (UniqueName: \"kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx\") pod \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.857380 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities" (OuterVolumeSpecName: "utilities") pod "172d6045-7f8e-45ca-b75b-c12f0779e686" (UID: "172d6045-7f8e-45ca-b75b-c12f0779e686"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.857770 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcwm9\" (UniqueName: \"kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9\") pod \"172d6045-7f8e-45ca-b75b-c12f0779e686\" (UID: \"172d6045-7f8e-45ca-b75b-c12f0779e686\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.857822 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory\") pod \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\" (UID: \"1a2a0abe-d614-4f65-b832-06b9ddbdef54\") " Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.858535 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.863141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9" (OuterVolumeSpecName: "kube-api-access-pcwm9") pod "172d6045-7f8e-45ca-b75b-c12f0779e686" (UID: "172d6045-7f8e-45ca-b75b-c12f0779e686"). InnerVolumeSpecName "kube-api-access-pcwm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.863367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx" (OuterVolumeSpecName: "kube-api-access-r8vjx") pod "1a2a0abe-d614-4f65-b832-06b9ddbdef54" (UID: "1a2a0abe-d614-4f65-b832-06b9ddbdef54"). InnerVolumeSpecName "kube-api-access-r8vjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.886519 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory" (OuterVolumeSpecName: "inventory") pod "1a2a0abe-d614-4f65-b832-06b9ddbdef54" (UID: "1a2a0abe-d614-4f65-b832-06b9ddbdef54"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.888343 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1a2a0abe-d614-4f65-b832-06b9ddbdef54" (UID: "1a2a0abe-d614-4f65-b832-06b9ddbdef54"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.907682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "172d6045-7f8e-45ca-b75b-c12f0779e686" (UID: "172d6045-7f8e-45ca-b75b-c12f0779e686"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.960379 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcwm9\" (UniqueName: \"kubernetes.io/projected/172d6045-7f8e-45ca-b75b-c12f0779e686-kube-api-access-pcwm9\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.960447 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.960461 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/172d6045-7f8e-45ca-b75b-c12f0779e686-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.960476 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a2a0abe-d614-4f65-b832-06b9ddbdef54-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:19 crc kubenswrapper[4895]: I0320 13:55:19.960489 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8vjx\" (UniqueName: \"kubernetes.io/projected/1a2a0abe-d614-4f65-b832-06b9ddbdef54-kube-api-access-r8vjx\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.210248 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" event={"ID":"1a2a0abe-d614-4f65-b832-06b9ddbdef54","Type":"ContainerDied","Data":"dde5cb84de15d72b29a364d2ec8ec279510bd5960c32a88e3aa594fde0f86ded"} Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.210301 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dde5cb84de15d72b29a364d2ec8ec279510bd5960c32a88e3aa594fde0f86ded" Mar 20 
13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.210510 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.211946 4895 generic.go:334] "Generic (PLEG): container finished" podID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerID="c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab" exitCode=0 Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.211994 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerDied","Data":"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab"} Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.212022 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn2cp" event={"ID":"172d6045-7f8e-45ca-b75b-c12f0779e686","Type":"ContainerDied","Data":"eacf7e174e0cb626aa3dc28a535fe446aabbf3f3d42b63c1ac599754831d7059"} Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.212038 4895 scope.go:117] "RemoveContainer" containerID="c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.212158 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qn2cp" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.273799 4895 scope.go:117] "RemoveContainer" containerID="06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.285326 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.315990 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qn2cp"] Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.364706 4895 scope.go:117] "RemoveContainer" containerID="8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.397299 4895 scope.go:117] "RemoveContainer" containerID="c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.397712 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab\": container with ID starting with c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab not found: ID does not exist" containerID="c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.397745 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab"} err="failed to get container status \"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab\": rpc error: code = NotFound desc = could not find container \"c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab\": container with ID starting with c88611ae7331dc530ec2328157a4110c338dd1a4e46b76cb618f3167cfc545ab not 
found: ID does not exist" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.397765 4895 scope.go:117] "RemoveContainer" containerID="06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.398107 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0\": container with ID starting with 06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0 not found: ID does not exist" containerID="06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.398128 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0"} err="failed to get container status \"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0\": rpc error: code = NotFound desc = could not find container \"06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0\": container with ID starting with 06d6b8503efb30040b5517672eeefea7bb056e87d724af613e2e94b4802da6f0 not found: ID does not exist" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.398140 4895 scope.go:117] "RemoveContainer" containerID="8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.398334 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c\": container with ID starting with 8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c not found: ID does not exist" containerID="8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.398350 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c"} err="failed to get container status \"8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c\": rpc error: code = NotFound desc = could not find container \"8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c\": container with ID starting with 8dfa7b0af165266f511e91edbc3fb58e231a2c9c01a56264487552451971012c not found: ID does not exist" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.428709 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj"] Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429119 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="extract-content" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429135 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="extract-content" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429155 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429161 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429168 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="extract-utilities" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429174 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="extract-utilities" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429185 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1a2a0abe-d614-4f65-b832-06b9ddbdef54" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429192 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a2a0abe-d614-4f65-b832-06b9ddbdef54" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429204 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="extract-content" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429210 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="extract-content" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429225 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="registry-server" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429231 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="registry-server" Mar 20 13:55:20 crc kubenswrapper[4895]: E0320 13:55:20.429243 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="extract-utilities" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429248 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="extract-utilities" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429454 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f9655e-3d95-404c-9699-a1df11b6197e" containerName="registry-server" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.429480 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" containerName="registry-server" Mar 20 13:55:20 crc 
kubenswrapper[4895]: I0320 13:55:20.429494 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a2a0abe-d614-4f65-b832-06b9ddbdef54" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.430184 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.435313 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.435519 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.436561 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.436690 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.436820 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.436569 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.438738 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.438890 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.458875 4895 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj"] Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.581790 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.581853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.581888 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.581985 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: 
\"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582039 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582178 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582229 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582405 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: 
\"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582498 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582536 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvvj8\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582573 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582607 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.582675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.684896 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.684959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.684980 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.685773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvvj8\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.685869 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.685914 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686032 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686273 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686355 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686498 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686545 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686732 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.686841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.691787 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.691978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.691992 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.692699 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.692862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.693258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.693786 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.694782 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.697255 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: 
\"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.698789 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.699104 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.702533 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.703497 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvvj8\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.703744 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:20 crc kubenswrapper[4895]: I0320 13:55:20.766155 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:21 crc kubenswrapper[4895]: I0320 13:55:21.227550 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="172d6045-7f8e-45ca-b75b-c12f0779e686" path="/var/lib/kubelet/pods/172d6045-7f8e-45ca-b75b-c12f0779e686/volumes" Mar 20 13:55:21 crc kubenswrapper[4895]: I0320 13:55:21.295530 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj"] Mar 20 13:55:22 crc kubenswrapper[4895]: I0320 13:55:22.246976 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" event={"ID":"3edac150-5a84-4c67-8999-f0161dc784ba","Type":"ContainerStarted","Data":"1b614dd646e6b06495d114e07ecbd12e3f40f0b1eef23bccf8e22d24db0d62fb"} Mar 20 13:55:22 crc kubenswrapper[4895]: I0320 13:55:22.247375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" event={"ID":"3edac150-5a84-4c67-8999-f0161dc784ba","Type":"ContainerStarted","Data":"5e97086ab161089beebfa9bf305e27719058bc81f2f95944c093692660776289"} Mar 20 13:55:27 crc kubenswrapper[4895]: I0320 13:55:27.166711 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:27 crc kubenswrapper[4895]: I0320 13:55:27.202275 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" podStartSLOduration=6.597783022 podStartE2EDuration="7.202202039s" podCreationTimestamp="2026-03-20 13:55:20 +0000 UTC" firstStartedPulling="2026-03-20 13:55:21.286636781 +0000 UTC m=+2020.796355747" lastFinishedPulling="2026-03-20 13:55:21.891055788 +0000 UTC m=+2021.400774764" observedRunningTime="2026-03-20 13:55:22.278795718 +0000 UTC m=+2021.788514684" watchObservedRunningTime="2026-03-20 13:55:27.202202039 +0000 UTC m=+2026.711921045" Mar 20 13:55:27 crc kubenswrapper[4895]: I0320 13:55:27.239858 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:27 crc kubenswrapper[4895]: I0320 13:55:27.408207 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:55:28 crc kubenswrapper[4895]: I0320 13:55:28.339129 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c82f7" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" containerID="cri-o://aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285" gracePeriod=2 Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.066035 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.192634 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities\") pod \"3258d433-c990-40f9-9b48-729c3bc7ad30\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.192815 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content\") pod \"3258d433-c990-40f9-9b48-729c3bc7ad30\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.192920 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkck\" (UniqueName: \"kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck\") pod \"3258d433-c990-40f9-9b48-729c3bc7ad30\" (UID: \"3258d433-c990-40f9-9b48-729c3bc7ad30\") " Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.194306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities" (OuterVolumeSpecName: "utilities") pod "3258d433-c990-40f9-9b48-729c3bc7ad30" (UID: "3258d433-c990-40f9-9b48-729c3bc7ad30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.201711 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck" (OuterVolumeSpecName: "kube-api-access-nlkck") pod "3258d433-c990-40f9-9b48-729c3bc7ad30" (UID: "3258d433-c990-40f9-9b48-729c3bc7ad30"). InnerVolumeSpecName "kube-api-access-nlkck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.295555 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkck\" (UniqueName: \"kubernetes.io/projected/3258d433-c990-40f9-9b48-729c3bc7ad30-kube-api-access-nlkck\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.295583 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.348432 4895 generic.go:334] "Generic (PLEG): container finished" podID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerID="aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285" exitCode=0 Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.348472 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerDied","Data":"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285"} Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.348496 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c82f7" event={"ID":"3258d433-c990-40f9-9b48-729c3bc7ad30","Type":"ContainerDied","Data":"a9a50b024036f6d41c7928312ed88d3e9fd8c608a416f570429295475a8dcc2f"} Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.348511 4895 scope.go:117] "RemoveContainer" containerID="aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.348630 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c82f7" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.367040 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3258d433-c990-40f9-9b48-729c3bc7ad30" (UID: "3258d433-c990-40f9-9b48-729c3bc7ad30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.369983 4895 scope.go:117] "RemoveContainer" containerID="aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.397769 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3258d433-c990-40f9-9b48-729c3bc7ad30-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.405130 4895 scope.go:117] "RemoveContainer" containerID="4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.458161 4895 scope.go:117] "RemoveContainer" containerID="aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285" Mar 20 13:55:29 crc kubenswrapper[4895]: E0320 13:55:29.458908 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285\": container with ID starting with aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285 not found: ID does not exist" containerID="aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.458965 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285"} err="failed to get container status \"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285\": rpc error: code = NotFound desc = could not find container \"aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285\": container with ID starting with aa380c36e72fba9c75fadd55c47326efb2630e9e70bba533d146b904c2217285 not found: ID does not exist" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.459000 4895 scope.go:117] "RemoveContainer" containerID="aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc" Mar 20 13:55:29 crc kubenswrapper[4895]: E0320 13:55:29.459359 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc\": container with ID starting with aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc not found: ID does not exist" containerID="aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.459417 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc"} err="failed to get container status \"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc\": rpc error: code = NotFound desc = could not find container \"aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc\": container with ID starting with aff1ba195f470f38c5c718e48e27d5f14f1f08554c9d1eb93ef0f9aba10f80cc not found: ID does not exist" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.459449 4895 scope.go:117] "RemoveContainer" containerID="4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f" Mar 20 13:55:29 crc kubenswrapper[4895]: E0320 13:55:29.459865 4895 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f\": container with ID starting with 4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f not found: ID does not exist" containerID="4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.459899 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f"} err="failed to get container status \"4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f\": rpc error: code = NotFound desc = could not find container \"4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f\": container with ID starting with 4d617b8d74b523da23f912e3f47d681f71f9400c2b9d709e92f16cab8edc574f not found: ID does not exist" Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.685695 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:55:29 crc kubenswrapper[4895]: I0320 13:55:29.693383 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c82f7"] Mar 20 13:55:31 crc kubenswrapper[4895]: I0320 13:55:31.224795 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" path="/var/lib/kubelet/pods/3258d433-c990-40f9-9b48-729c3bc7ad30/volumes" Mar 20 13:55:52 crc kubenswrapper[4895]: I0320 13:55:52.297153 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:55:52 crc kubenswrapper[4895]: I0320 13:55:52.297685 4895 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:55:56 crc kubenswrapper[4895]: I0320 13:55:56.631707 4895 generic.go:334] "Generic (PLEG): container finished" podID="3edac150-5a84-4c67-8999-f0161dc784ba" containerID="1b614dd646e6b06495d114e07ecbd12e3f40f0b1eef23bccf8e22d24db0d62fb" exitCode=0 Mar 20 13:55:56 crc kubenswrapper[4895]: I0320 13:55:56.631787 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" event={"ID":"3edac150-5a84-4c67-8999-f0161dc784ba","Type":"ContainerDied","Data":"1b614dd646e6b06495d114e07ecbd12e3f40f0b1eef23bccf8e22d24db0d62fb"} Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.093300 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135302 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135686 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135781 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135810 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135836 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: 
\"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135873 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135927 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.135949 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136032 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136056 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136146 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136182 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136216 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.136253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvvj8\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8\") pod \"3edac150-5a84-4c67-8999-f0161dc784ba\" (UID: \"3edac150-5a84-4c67-8999-f0161dc784ba\") " Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.143280 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.150264 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8" (OuterVolumeSpecName: "kube-api-access-rvvj8") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "kube-api-access-rvvj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.153641 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.155450 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.155691 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.156209 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.157687 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.159611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.159906 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.160404 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.160382 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.163226 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.185896 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.189916 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory" (OuterVolumeSpecName: "inventory") pod "3edac150-5a84-4c67-8999-f0161dc784ba" (UID: "3edac150-5a84-4c67-8999-f0161dc784ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238363 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238446 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238461 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238474 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238488 4895 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238500 4895 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238515 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238531 4895 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238544 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238558 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvvj8\" (UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-kube-api-access-rvvj8\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238569 4895 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238581 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/3edac150-5a84-4c67-8999-f0161dc784ba-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238593 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.238606 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edac150-5a84-4c67-8999-f0161dc784ba-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.650610 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" event={"ID":"3edac150-5a84-4c67-8999-f0161dc784ba","Type":"ContainerDied","Data":"5e97086ab161089beebfa9bf305e27719058bc81f2f95944c093692660776289"} Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.650670 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e97086ab161089beebfa9bf305e27719058bc81f2f95944c093692660776289" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.650640 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.768436 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4"] Mar 20 13:55:58 crc kubenswrapper[4895]: E0320 13:55:58.769065 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="extract-content" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769134 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="extract-content" Mar 20 13:55:58 crc kubenswrapper[4895]: E0320 13:55:58.769224 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="extract-utilities" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769277 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="extract-utilities" Mar 20 13:55:58 crc kubenswrapper[4895]: E0320 13:55:58.769347 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edac150-5a84-4c67-8999-f0161dc784ba" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769419 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edac150-5a84-4c67-8999-f0161dc784ba" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:58 crc kubenswrapper[4895]: E0320 13:55:58.769481 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769530 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769779 
4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edac150-5a84-4c67-8999-f0161dc784ba" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.769844 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3258d433-c990-40f9-9b48-729c3bc7ad30" containerName="registry-server" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.770599 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.772944 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.773130 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.773250 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.773562 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.773882 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.788698 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4"] Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.851404 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: 
\"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.851491 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.851545 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.851575 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.851671 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrwrq\" (UniqueName: \"kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 
13:55:58.953257 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.953321 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.953355 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.953460 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrwrq\" (UniqueName: \"kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.953511 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.954261 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.957770 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.959232 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.960044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:58 crc kubenswrapper[4895]: I0320 13:55:58.980349 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qrwrq\" (UniqueName: \"kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z9xg4\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:59 crc kubenswrapper[4895]: I0320 13:55:59.088859 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:55:59 crc kubenswrapper[4895]: I0320 13:55:59.614706 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4"] Mar 20 13:55:59 crc kubenswrapper[4895]: I0320 13:55:59.662847 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" event={"ID":"d67ed2a1-5121-41c0-a5d9-3962837f0cb2","Type":"ContainerStarted","Data":"49ea5b2d5306341658f09aa8b96949511e20e965d975abafe01f7c091f3b1637"} Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.133593 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-pphbn"] Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.135115 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.139931 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.139963 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.140235 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.143471 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-pphbn"] Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.177981 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9674\" (UniqueName: \"kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674\") pod \"auto-csr-approver-29566916-pphbn\" (UID: \"73b1bb13-ebec-4961-afb7-c14cba59bd99\") " pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.280342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9674\" (UniqueName: \"kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674\") pod \"auto-csr-approver-29566916-pphbn\" (UID: \"73b1bb13-ebec-4961-afb7-c14cba59bd99\") " pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.311522 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9674\" (UniqueName: \"kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674\") pod \"auto-csr-approver-29566916-pphbn\" (UID: \"73b1bb13-ebec-4961-afb7-c14cba59bd99\") " 
pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.462813 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:00 crc kubenswrapper[4895]: I0320 13:56:00.959139 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-pphbn"] Mar 20 13:56:00 crc kubenswrapper[4895]: W0320 13:56:00.966140 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73b1bb13_ebec_4961_afb7_c14cba59bd99.slice/crio-6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b WatchSource:0}: Error finding container 6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b: Status 404 returned error can't find the container with id 6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b Mar 20 13:56:01 crc kubenswrapper[4895]: I0320 13:56:01.692694 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-pphbn" event={"ID":"73b1bb13-ebec-4961-afb7-c14cba59bd99","Type":"ContainerStarted","Data":"6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b"} Mar 20 13:56:01 crc kubenswrapper[4895]: I0320 13:56:01.694479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" event={"ID":"d67ed2a1-5121-41c0-a5d9-3962837f0cb2","Type":"ContainerStarted","Data":"3da93ad27273c1195ac896c3ce9360260f9195e0a414c6ea3a682b08d932e31c"} Mar 20 13:56:01 crc kubenswrapper[4895]: I0320 13:56:01.739047 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" podStartSLOduration=2.7325698149999997 podStartE2EDuration="3.739022843s" podCreationTimestamp="2026-03-20 13:55:58 +0000 UTC" firstStartedPulling="2026-03-20 13:55:59.614989165 
+0000 UTC m=+2059.124708141" lastFinishedPulling="2026-03-20 13:56:00.621442213 +0000 UTC m=+2060.131161169" observedRunningTime="2026-03-20 13:56:01.720277774 +0000 UTC m=+2061.229996780" watchObservedRunningTime="2026-03-20 13:56:01.739022843 +0000 UTC m=+2061.248741819" Mar 20 13:56:02 crc kubenswrapper[4895]: I0320 13:56:02.704802 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-pphbn" event={"ID":"73b1bb13-ebec-4961-afb7-c14cba59bd99","Type":"ContainerStarted","Data":"e1c43fa8ff8a386234aeab037db15eb7c5d66576c5986243a27b7f5bc3c4c444"} Mar 20 13:56:02 crc kubenswrapper[4895]: E0320 13:56:02.976819 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73b1bb13_ebec_4961_afb7_c14cba59bd99.slice/crio-e1c43fa8ff8a386234aeab037db15eb7c5d66576c5986243a27b7f5bc3c4c444.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:56:03 crc kubenswrapper[4895]: I0320 13:56:03.716129 4895 generic.go:334] "Generic (PLEG): container finished" podID="73b1bb13-ebec-4961-afb7-c14cba59bd99" containerID="e1c43fa8ff8a386234aeab037db15eb7c5d66576c5986243a27b7f5bc3c4c444" exitCode=0 Mar 20 13:56:03 crc kubenswrapper[4895]: I0320 13:56:03.716222 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-pphbn" event={"ID":"73b1bb13-ebec-4961-afb7-c14cba59bd99","Type":"ContainerDied","Data":"e1c43fa8ff8a386234aeab037db15eb7c5d66576c5986243a27b7f5bc3c4c444"} Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.127242 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.190201 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9674\" (UniqueName: \"kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674\") pod \"73b1bb13-ebec-4961-afb7-c14cba59bd99\" (UID: \"73b1bb13-ebec-4961-afb7-c14cba59bd99\") " Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.196256 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674" (OuterVolumeSpecName: "kube-api-access-q9674") pod "73b1bb13-ebec-4961-afb7-c14cba59bd99" (UID: "73b1bb13-ebec-4961-afb7-c14cba59bd99"). InnerVolumeSpecName "kube-api-access-q9674". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.293724 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9674\" (UniqueName: \"kubernetes.io/projected/73b1bb13-ebec-4961-afb7-c14cba59bd99-kube-api-access-q9674\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.738626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-pphbn" event={"ID":"73b1bb13-ebec-4961-afb7-c14cba59bd99","Type":"ContainerDied","Data":"6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b"} Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.738686 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6975d734d2bd9c6bd422f2d63266434520d5dccf70ca8091fd9b3f373a7d135b" Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.738788 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-pphbn" Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.795200 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-fxn58"] Mar 20 13:56:05 crc kubenswrapper[4895]: I0320 13:56:05.804366 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-fxn58"] Mar 20 13:56:07 crc kubenswrapper[4895]: I0320 13:56:07.224968 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c47434-e191-4697-83c8-6bd904c9a2c7" path="/var/lib/kubelet/pods/b2c47434-e191-4697-83c8-6bd904c9a2c7/volumes" Mar 20 13:56:19 crc kubenswrapper[4895]: I0320 13:56:19.112558 4895 scope.go:117] "RemoveContainer" containerID="95d81e8beb0e84e1165f14f28fe92796bd058b15bb3c7c946e84a36e0b62b075" Mar 20 13:56:22 crc kubenswrapper[4895]: I0320 13:56:22.296903 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:22 crc kubenswrapper[4895]: I0320 13:56:22.297528 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:29 crc kubenswrapper[4895]: I0320 13:56:29.056156 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-d69kx"] Mar 20 13:56:29 crc kubenswrapper[4895]: I0320 13:56:29.067789 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-d69kx"] Mar 20 13:56:29 crc kubenswrapper[4895]: I0320 13:56:29.228987 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb5ee4b2-1013-4687-b3aa-df5362f4b435" path="/var/lib/kubelet/pods/bb5ee4b2-1013-4687-b3aa-df5362f4b435/volumes" Mar 20 13:56:35 crc kubenswrapper[4895]: I0320 13:56:35.026005 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-8j7fc"] Mar 20 13:56:35 crc kubenswrapper[4895]: I0320 13:56:35.036040 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-8j7fc"] Mar 20 13:56:35 crc kubenswrapper[4895]: I0320 13:56:35.222031 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9547e88e-4e6b-4034-a86e-d8145d5257e1" path="/var/lib/kubelet/pods/9547e88e-4e6b-4034-a86e-d8145d5257e1/volumes" Mar 20 13:56:52 crc kubenswrapper[4895]: I0320 13:56:52.297575 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:52 crc kubenswrapper[4895]: I0320 13:56:52.298239 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:52 crc kubenswrapper[4895]: I0320 13:56:52.298292 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:56:52 crc kubenswrapper[4895]: I0320 13:56:52.299119 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569"} 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:56:52 crc kubenswrapper[4895]: I0320 13:56:52.299181 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569" gracePeriod=600 Mar 20 13:56:53 crc kubenswrapper[4895]: I0320 13:56:53.398228 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569" exitCode=0 Mar 20 13:56:53 crc kubenswrapper[4895]: I0320 13:56:53.398302 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569"} Mar 20 13:56:53 crc kubenswrapper[4895]: I0320 13:56:53.398746 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8"} Mar 20 13:56:53 crc kubenswrapper[4895]: I0320 13:56:53.398764 4895 scope.go:117] "RemoveContainer" containerID="263eb64c38ed50eb0ba343e0b64dfb6d90338421ef49a7f98457b6570f3d4732" Mar 20 13:56:59 crc kubenswrapper[4895]: I0320 13:56:59.459535 4895 generic.go:334] "Generic (PLEG): container finished" podID="d67ed2a1-5121-41c0-a5d9-3962837f0cb2" containerID="3da93ad27273c1195ac896c3ce9360260f9195e0a414c6ea3a682b08d932e31c" exitCode=0 Mar 20 13:56:59 crc kubenswrapper[4895]: I0320 13:56:59.459638 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" event={"ID":"d67ed2a1-5121-41c0-a5d9-3962837f0cb2","Type":"ContainerDied","Data":"3da93ad27273c1195ac896c3ce9360260f9195e0a414c6ea3a682b08d932e31c"} Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.023990 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.178583 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle\") pod \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.178743 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0\") pod \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.178838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrwrq\" (UniqueName: \"kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq\") pod \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.178953 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory\") pod \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.179078 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam\") pod \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\" (UID: \"d67ed2a1-5121-41c0-a5d9-3962837f0cb2\") " Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.185218 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq" (OuterVolumeSpecName: "kube-api-access-qrwrq") pod "d67ed2a1-5121-41c0-a5d9-3962837f0cb2" (UID: "d67ed2a1-5121-41c0-a5d9-3962837f0cb2"). InnerVolumeSpecName "kube-api-access-qrwrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.185425 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d67ed2a1-5121-41c0-a5d9-3962837f0cb2" (UID: "d67ed2a1-5121-41c0-a5d9-3962837f0cb2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.205580 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d67ed2a1-5121-41c0-a5d9-3962837f0cb2" (UID: "d67ed2a1-5121-41c0-a5d9-3962837f0cb2"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.208629 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d67ed2a1-5121-41c0-a5d9-3962837f0cb2" (UID: "d67ed2a1-5121-41c0-a5d9-3962837f0cb2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.229675 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory" (OuterVolumeSpecName: "inventory") pod "d67ed2a1-5121-41c0-a5d9-3962837f0cb2" (UID: "d67ed2a1-5121-41c0-a5d9-3962837f0cb2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.282725 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.282758 4895 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.282831 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrwrq\" (UniqueName: \"kubernetes.io/projected/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-kube-api-access-qrwrq\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.282846 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.282916 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d67ed2a1-5121-41c0-a5d9-3962837f0cb2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.481255 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" event={"ID":"d67ed2a1-5121-41c0-a5d9-3962837f0cb2","Type":"ContainerDied","Data":"49ea5b2d5306341658f09aa8b96949511e20e965d975abafe01f7c091f3b1637"} Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.481621 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49ea5b2d5306341658f09aa8b96949511e20e965d975abafe01f7c091f3b1637" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.481320 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z9xg4" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.613835 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5"] Mar 20 13:57:01 crc kubenswrapper[4895]: E0320 13:57:01.614295 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b1bb13-ebec-4961-afb7-c14cba59bd99" containerName="oc" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.614321 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b1bb13-ebec-4961-afb7-c14cba59bd99" containerName="oc" Mar 20 13:57:01 crc kubenswrapper[4895]: E0320 13:57:01.614346 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67ed2a1-5121-41c0-a5d9-3962837f0cb2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.614355 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67ed2a1-5121-41c0-a5d9-3962837f0cb2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.614626 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b1bb13-ebec-4961-afb7-c14cba59bd99" containerName="oc" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.614657 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67ed2a1-5121-41c0-a5d9-3962837f0cb2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.615528 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.619870 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.620324 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.620342 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.620500 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.621540 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.621849 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.641826 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5"] Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.805644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.805771 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.805914 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.806086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64nvw\" (UniqueName: \"kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.806156 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.806204 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908443 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908534 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908581 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64nvw\" 
(UniqueName: \"kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908649 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.908680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.913139 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.913868 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.914257 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.915204 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.916060 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.928103 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64nvw\" (UniqueName: \"kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5\" (UID: 
\"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:01 crc kubenswrapper[4895]: I0320 13:57:01.936086 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:02 crc kubenswrapper[4895]: I0320 13:57:02.498226 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5"] Mar 20 13:57:02 crc kubenswrapper[4895]: W0320 13:57:02.520091 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc06d8c6_f0e5_4555_84d0_e67d1a358e18.slice/crio-569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17 WatchSource:0}: Error finding container 569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17: Status 404 returned error can't find the container with id 569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17 Mar 20 13:57:02 crc kubenswrapper[4895]: I0320 13:57:02.521834 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:57:03 crc kubenswrapper[4895]: I0320 13:57:03.522904 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" event={"ID":"dc06d8c6-f0e5-4555-84d0-e67d1a358e18","Type":"ContainerStarted","Data":"f623fab93d8f0e0cb789c4bf40e5f5d3d597344fd6f82d48a8da68092e1aa51b"} Mar 20 13:57:03 crc kubenswrapper[4895]: I0320 13:57:03.523331 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" event={"ID":"dc06d8c6-f0e5-4555-84d0-e67d1a358e18","Type":"ContainerStarted","Data":"569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17"} Mar 20 13:57:03 crc kubenswrapper[4895]: I0320 13:57:03.551692 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" podStartSLOduration=2.086391624 podStartE2EDuration="2.551667072s" podCreationTimestamp="2026-03-20 13:57:01 +0000 UTC" firstStartedPulling="2026-03-20 13:57:02.521540745 +0000 UTC m=+2122.031259711" lastFinishedPulling="2026-03-20 13:57:02.986816193 +0000 UTC m=+2122.496535159" observedRunningTime="2026-03-20 13:57:03.539940915 +0000 UTC m=+2123.049659881" watchObservedRunningTime="2026-03-20 13:57:03.551667072 +0000 UTC m=+2123.061386048" Mar 20 13:57:19 crc kubenswrapper[4895]: I0320 13:57:19.233794 4895 scope.go:117] "RemoveContainer" containerID="426631ec71ba496fbd3a725082cb1a64b438daaad90ae43f5d881da6219047f0" Mar 20 13:57:19 crc kubenswrapper[4895]: I0320 13:57:19.278037 4895 scope.go:117] "RemoveContainer" containerID="83b93b75d955dfefd2bfcebf3d45c221ddbe20c674c5c90be7b02296d67e3358" Mar 20 13:57:46 crc kubenswrapper[4895]: I0320 13:57:46.968220 4895 generic.go:334] "Generic (PLEG): container finished" podID="dc06d8c6-f0e5-4555-84d0-e67d1a358e18" containerID="f623fab93d8f0e0cb789c4bf40e5f5d3d597344fd6f82d48a8da68092e1aa51b" exitCode=0 Mar 20 13:57:46 crc kubenswrapper[4895]: I0320 13:57:46.968278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" event={"ID":"dc06d8c6-f0e5-4555-84d0-e67d1a358e18","Type":"ContainerDied","Data":"f623fab93d8f0e0cb789c4bf40e5f5d3d597344fd6f82d48a8da68092e1aa51b"} Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.458640 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.536329 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.536381 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.536605 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.536672 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64nvw\" (UniqueName: \"kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.536718 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc 
kubenswrapper[4895]: I0320 13:57:48.536744 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle\") pod \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\" (UID: \"dc06d8c6-f0e5-4555-84d0-e67d1a358e18\") " Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.554195 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.554380 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw" (OuterVolumeSpecName: "kube-api-access-64nvw") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "kube-api-access-64nvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.566270 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory" (OuterVolumeSpecName: "inventory") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.571697 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.575925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.578672 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dc06d8c6-f0e5-4555-84d0-e67d1a358e18" (UID: "dc06d8c6-f0e5-4555-84d0-e67d1a358e18"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639545 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639585 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639598 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639612 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64nvw\" (UniqueName: \"kubernetes.io/projected/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-kube-api-access-64nvw\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639626 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:48 crc kubenswrapper[4895]: I0320 13:57:48.639652 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc06d8c6-f0e5-4555-84d0-e67d1a358e18-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:48.999596 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" event={"ID":"dc06d8c6-f0e5-4555-84d0-e67d1a358e18","Type":"ContainerDied","Data":"569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17"} Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.000141 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569ee268d8e9eb05774136bb6fd2458d4cabad76aaf8f963ff098407ab162a17" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.000222 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.093460 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg"] Mar 20 13:57:49 crc kubenswrapper[4895]: E0320 13:57:49.093976 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc06d8c6-f0e5-4555-84d0-e67d1a358e18" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.094002 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc06d8c6-f0e5-4555-84d0-e67d1a358e18" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.094263 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc06d8c6-f0e5-4555-84d0-e67d1a358e18" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.095066 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.098054 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.098364 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.098980 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.099253 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.101955 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.108004 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg"] Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.150514 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.150615 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.150673 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.150710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54fwl\" (UniqueName: \"kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.150750 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.252311 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54fwl\" (UniqueName: \"kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.252380 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.252600 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.252659 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.252689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.257300 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.258375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.259847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.266284 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.273577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54fwl\" (UniqueName: \"kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.412876 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 13:57:49 crc kubenswrapper[4895]: I0320 13:57:49.952674 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg"] Mar 20 13:57:50 crc kubenswrapper[4895]: I0320 13:57:50.009273 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" event={"ID":"584873be-8282-406d-9a6a-2abb61f6d3bd","Type":"ContainerStarted","Data":"b450fe6a4bc3165461cf259d8efd8c6adf38d5c573047400ac842ae20d54cb97"} Mar 20 13:57:51 crc kubenswrapper[4895]: I0320 13:57:51.023706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" event={"ID":"584873be-8282-406d-9a6a-2abb61f6d3bd","Type":"ContainerStarted","Data":"e597e8d3a6aaf2de1d035815a641c2ffa27d7291c179e69b59675413d2c4b8ee"} Mar 20 13:57:51 crc kubenswrapper[4895]: I0320 13:57:51.042995 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" podStartSLOduration=1.6081050609999998 podStartE2EDuration="2.042979785s" podCreationTimestamp="2026-03-20 13:57:49 +0000 UTC" firstStartedPulling="2026-03-20 13:57:49.952685063 +0000 UTC m=+2169.462404029" lastFinishedPulling="2026-03-20 13:57:50.387559787 +0000 UTC m=+2169.897278753" observedRunningTime="2026-03-20 13:57:51.042441391 +0000 UTC m=+2170.552160397" watchObservedRunningTime="2026-03-20 13:57:51.042979785 +0000 UTC m=+2170.552698751" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.164583 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566918-89qqb"] Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.167701 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.171049 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.171255 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.180118 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.183846 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-89qqb"] Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.250545 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgd5p\" (UniqueName: \"kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p\") pod \"auto-csr-approver-29566918-89qqb\" (UID: \"fc89b85a-ad36-4408-a21e-c83299aa045b\") " pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.353043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgd5p\" (UniqueName: \"kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p\") pod \"auto-csr-approver-29566918-89qqb\" (UID: \"fc89b85a-ad36-4408-a21e-c83299aa045b\") " pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.380597 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgd5p\" (UniqueName: \"kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p\") pod \"auto-csr-approver-29566918-89qqb\" (UID: \"fc89b85a-ad36-4408-a21e-c83299aa045b\") " 
pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.499327 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:00 crc kubenswrapper[4895]: I0320 13:58:00.997629 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-89qqb"] Mar 20 13:58:01 crc kubenswrapper[4895]: I0320 13:58:01.140189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-89qqb" event={"ID":"fc89b85a-ad36-4408-a21e-c83299aa045b","Type":"ContainerStarted","Data":"a27bfbee2b3c5c0703208332377c8eb906ec671bdfa77a4c03a3d36c6e6ff0ca"} Mar 20 13:58:03 crc kubenswrapper[4895]: I0320 13:58:03.161362 4895 generic.go:334] "Generic (PLEG): container finished" podID="fc89b85a-ad36-4408-a21e-c83299aa045b" containerID="fd880478f1729cc6de52a360937ab6a2315c353b5da97495e3826f4658a5ef00" exitCode=0 Mar 20 13:58:03 crc kubenswrapper[4895]: I0320 13:58:03.161424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-89qqb" event={"ID":"fc89b85a-ad36-4408-a21e-c83299aa045b","Type":"ContainerDied","Data":"fd880478f1729cc6de52a360937ab6a2315c353b5da97495e3826f4658a5ef00"} Mar 20 13:58:04 crc kubenswrapper[4895]: I0320 13:58:04.585444 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:04 crc kubenswrapper[4895]: I0320 13:58:04.758215 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgd5p\" (UniqueName: \"kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p\") pod \"fc89b85a-ad36-4408-a21e-c83299aa045b\" (UID: \"fc89b85a-ad36-4408-a21e-c83299aa045b\") " Mar 20 13:58:04 crc kubenswrapper[4895]: I0320 13:58:04.764931 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p" (OuterVolumeSpecName: "kube-api-access-zgd5p") pod "fc89b85a-ad36-4408-a21e-c83299aa045b" (UID: "fc89b85a-ad36-4408-a21e-c83299aa045b"). InnerVolumeSpecName "kube-api-access-zgd5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:04 crc kubenswrapper[4895]: I0320 13:58:04.860472 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgd5p\" (UniqueName: \"kubernetes.io/projected/fc89b85a-ad36-4408-a21e-c83299aa045b-kube-api-access-zgd5p\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:05 crc kubenswrapper[4895]: I0320 13:58:05.182121 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-89qqb" event={"ID":"fc89b85a-ad36-4408-a21e-c83299aa045b","Type":"ContainerDied","Data":"a27bfbee2b3c5c0703208332377c8eb906ec671bdfa77a4c03a3d36c6e6ff0ca"} Mar 20 13:58:05 crc kubenswrapper[4895]: I0320 13:58:05.182464 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27bfbee2b3c5c0703208332377c8eb906ec671bdfa77a4c03a3d36c6e6ff0ca" Mar 20 13:58:05 crc kubenswrapper[4895]: I0320 13:58:05.182205 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-89qqb" Mar 20 13:58:05 crc kubenswrapper[4895]: I0320 13:58:05.682208 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-26czq"] Mar 20 13:58:05 crc kubenswrapper[4895]: I0320 13:58:05.692602 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-26czq"] Mar 20 13:58:07 crc kubenswrapper[4895]: I0320 13:58:07.228240 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a03ef8-3463-4c6f-9415-37069d4bcbc9" path="/var/lib/kubelet/pods/b2a03ef8-3463-4c6f-9415-37069d4bcbc9/volumes" Mar 20 13:58:19 crc kubenswrapper[4895]: I0320 13:58:19.401167 4895 scope.go:117] "RemoveContainer" containerID="9410c0ef5938c4d4069e870ba2f308e075f32e88d48b3fd28fa000eb9145e5f6" Mar 20 13:58:52 crc kubenswrapper[4895]: I0320 13:58:52.296927 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:58:52 crc kubenswrapper[4895]: I0320 13:58:52.297691 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.253931 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:10 crc kubenswrapper[4895]: E0320 13:59:10.254993 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc89b85a-ad36-4408-a21e-c83299aa045b" containerName="oc" Mar 20 13:59:10 crc 
kubenswrapper[4895]: I0320 13:59:10.255009 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89b85a-ad36-4408-a21e-c83299aa045b" containerName="oc" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.255274 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc89b85a-ad36-4408-a21e-c83299aa045b" containerName="oc" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.259774 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.278819 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.389949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.390140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.390204 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dggc\" (UniqueName: \"kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: 
I0320 13:59:10.492821 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.492928 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.492969 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dggc\" (UniqueName: \"kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.493372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.493524 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.519082 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dggc\" (UniqueName: \"kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc\") pod \"redhat-marketplace-4vlmq\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:10 crc kubenswrapper[4895]: I0320 13:59:10.582629 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:11 crc kubenswrapper[4895]: I0320 13:59:11.078326 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:11 crc kubenswrapper[4895]: I0320 13:59:11.914461 4895 generic.go:334] "Generic (PLEG): container finished" podID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerID="7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a" exitCode=0 Mar 20 13:59:11 crc kubenswrapper[4895]: I0320 13:59:11.914570 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerDied","Data":"7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a"} Mar 20 13:59:11 crc kubenswrapper[4895]: I0320 13:59:11.914818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerStarted","Data":"0e7afaf8ed85d445a7d1b43c08f1a4a0393bfd63274d07cafe9fe5435afecf9f"} Mar 20 13:59:12 crc kubenswrapper[4895]: I0320 13:59:12.925607 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerStarted","Data":"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f"} Mar 20 13:59:13 crc kubenswrapper[4895]: I0320 13:59:13.936366 4895 
generic.go:334] "Generic (PLEG): container finished" podID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerID="277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f" exitCode=0 Mar 20 13:59:13 crc kubenswrapper[4895]: I0320 13:59:13.936420 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerDied","Data":"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f"} Mar 20 13:59:14 crc kubenswrapper[4895]: I0320 13:59:14.950848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerStarted","Data":"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936"} Mar 20 13:59:14 crc kubenswrapper[4895]: I0320 13:59:14.971148 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4vlmq" podStartSLOduration=2.523227189 podStartE2EDuration="4.971128201s" podCreationTimestamp="2026-03-20 13:59:10 +0000 UTC" firstStartedPulling="2026-03-20 13:59:11.91711118 +0000 UTC m=+2251.426830146" lastFinishedPulling="2026-03-20 13:59:14.365012172 +0000 UTC m=+2253.874731158" observedRunningTime="2026-03-20 13:59:14.968494717 +0000 UTC m=+2254.478213723" watchObservedRunningTime="2026-03-20 13:59:14.971128201 +0000 UTC m=+2254.480847167" Mar 20 13:59:20 crc kubenswrapper[4895]: I0320 13:59:20.583109 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:20 crc kubenswrapper[4895]: I0320 13:59:20.583690 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:20 crc kubenswrapper[4895]: I0320 13:59:20.626780 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:21 crc kubenswrapper[4895]: I0320 13:59:21.097125 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:21 crc kubenswrapper[4895]: I0320 13:59:21.162621 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:22 crc kubenswrapper[4895]: I0320 13:59:22.297133 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:22 crc kubenswrapper[4895]: I0320 13:59:22.297536 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.031987 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4vlmq" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="registry-server" containerID="cri-o://cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936" gracePeriod=2 Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.614971 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.767914 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities\") pod \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.768017 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content\") pod \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.768073 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dggc\" (UniqueName: \"kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc\") pod \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\" (UID: \"db08aba8-c13a-4e50-ae28-159ac29d3f4b\") " Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.768985 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities" (OuterVolumeSpecName: "utilities") pod "db08aba8-c13a-4e50-ae28-159ac29d3f4b" (UID: "db08aba8-c13a-4e50-ae28-159ac29d3f4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.774592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc" (OuterVolumeSpecName: "kube-api-access-7dggc") pod "db08aba8-c13a-4e50-ae28-159ac29d3f4b" (UID: "db08aba8-c13a-4e50-ae28-159ac29d3f4b"). InnerVolumeSpecName "kube-api-access-7dggc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.795721 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db08aba8-c13a-4e50-ae28-159ac29d3f4b" (UID: "db08aba8-c13a-4e50-ae28-159ac29d3f4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.871145 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.871191 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db08aba8-c13a-4e50-ae28-159ac29d3f4b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:23 crc kubenswrapper[4895]: I0320 13:59:23.871214 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dggc\" (UniqueName: \"kubernetes.io/projected/db08aba8-c13a-4e50-ae28-159ac29d3f4b-kube-api-access-7dggc\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.043516 4895 generic.go:334] "Generic (PLEG): container finished" podID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerID="cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936" exitCode=0 Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.043587 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4vlmq" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.043633 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerDied","Data":"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936"} Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.044014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4vlmq" event={"ID":"db08aba8-c13a-4e50-ae28-159ac29d3f4b","Type":"ContainerDied","Data":"0e7afaf8ed85d445a7d1b43c08f1a4a0393bfd63274d07cafe9fe5435afecf9f"} Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.044055 4895 scope.go:117] "RemoveContainer" containerID="cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.075834 4895 scope.go:117] "RemoveContainer" containerID="277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.091622 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.102611 4895 scope.go:117] "RemoveContainer" containerID="7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.103174 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4vlmq"] Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.145703 4895 scope.go:117] "RemoveContainer" containerID="cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936" Mar 20 13:59:24 crc kubenswrapper[4895]: E0320 13:59:24.146128 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936\": container with ID starting with cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936 not found: ID does not exist" containerID="cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.146166 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936"} err="failed to get container status \"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936\": rpc error: code = NotFound desc = could not find container \"cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936\": container with ID starting with cbf37f43a5a172aecae9d185ac2874736f3dc0e3ec807ac29055c4955e69a936 not found: ID does not exist" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.146191 4895 scope.go:117] "RemoveContainer" containerID="277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f" Mar 20 13:59:24 crc kubenswrapper[4895]: E0320 13:59:24.146475 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f\": container with ID starting with 277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f not found: ID does not exist" containerID="277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.146510 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f"} err="failed to get container status \"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f\": rpc error: code = NotFound desc = could not find container \"277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f\": container with ID 
starting with 277c33c4b025ebbf6c4f3521bedf171b95d55d82f3ed709a1552449302e4134f not found: ID does not exist" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.146534 4895 scope.go:117] "RemoveContainer" containerID="7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a" Mar 20 13:59:24 crc kubenswrapper[4895]: E0320 13:59:24.146893 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a\": container with ID starting with 7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a not found: ID does not exist" containerID="7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a" Mar 20 13:59:24 crc kubenswrapper[4895]: I0320 13:59:24.146933 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a"} err="failed to get container status \"7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a\": rpc error: code = NotFound desc = could not find container \"7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a\": container with ID starting with 7e692fe90cf3b7a7c5560e4b1a6054045192d9df1e4c4ed96e684b79ccb7da5a not found: ID does not exist" Mar 20 13:59:25 crc kubenswrapper[4895]: I0320 13:59:25.225967 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" path="/var/lib/kubelet/pods/db08aba8-c13a-4e50-ae28-159ac29d3f4b/volumes" Mar 20 13:59:52 crc kubenswrapper[4895]: I0320 13:59:52.297313 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:52 crc kubenswrapper[4895]: I0320 
13:59:52.298535 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:52 crc kubenswrapper[4895]: I0320 13:59:52.298628 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 13:59:52 crc kubenswrapper[4895]: I0320 13:59:52.299880 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:59:52 crc kubenswrapper[4895]: I0320 13:59:52.299987 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" gracePeriod=600 Mar 20 13:59:52 crc kubenswrapper[4895]: E0320 13:59:52.440994 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 13:59:53 crc kubenswrapper[4895]: I0320 13:59:53.334950 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" exitCode=0 Mar 20 13:59:53 crc kubenswrapper[4895]: I0320 13:59:53.335018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8"} Mar 20 13:59:53 crc kubenswrapper[4895]: I0320 13:59:53.335091 4895 scope.go:117] "RemoveContainer" containerID="156fc0bc3662ccbc251c9b9f1612ec730d5a2cf8bb48fa0ce5fe26d575210569" Mar 20 13:59:53 crc kubenswrapper[4895]: I0320 13:59:53.336267 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 13:59:53 crc kubenswrapper[4895]: E0320 13:59:53.337030 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.163759 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566920-ld29g"] Mar 20 14:00:00 crc kubenswrapper[4895]: E0320 14:00:00.164739 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.164753 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4895]: E0320 14:00:00.164765 4895 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.164771 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4895]: E0320 14:00:00.164797 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.164804 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.165001 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="db08aba8-c13a-4e50-ae28-159ac29d3f4b" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.165882 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.168506 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.170088 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.172721 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.180252 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x"] Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.181832 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.184543 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.184774 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.195778 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-ld29g"] Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.208814 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x"] Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.343914 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.344007 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhkc\" (UniqueName: \"kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc\") pod \"auto-csr-approver-29566920-ld29g\" (UID: \"dc1e628a-ef6d-4769-a6ab-77ba414689e3\") " pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.344535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfv4n\" (UniqueName: 
\"kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.344607 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.446909 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhkc\" (UniqueName: \"kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc\") pod \"auto-csr-approver-29566920-ld29g\" (UID: \"dc1e628a-ef6d-4769-a6ab-77ba414689e3\") " pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.447147 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfv4n\" (UniqueName: \"kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.447185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc 
kubenswrapper[4895]: I0320 14:00:00.447267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.448460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.462415 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.465841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfv4n\" (UniqueName: \"kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n\") pod \"collect-profiles-29566920-v4q6x\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.473183 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhkc\" (UniqueName: \"kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc\") pod \"auto-csr-approver-29566920-ld29g\" (UID: \"dc1e628a-ef6d-4769-a6ab-77ba414689e3\") " 
pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.489037 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.518894 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:00 crc kubenswrapper[4895]: I0320 14:00:00.986293 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-ld29g"] Mar 20 14:00:01 crc kubenswrapper[4895]: I0320 14:00:01.051057 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x"] Mar 20 14:00:01 crc kubenswrapper[4895]: W0320 14:00:01.051840 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3de4d5d0_e193_4c77_b93d_bb677e3cfc7a.slice/crio-2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380 WatchSource:0}: Error finding container 2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380: Status 404 returned error can't find the container with id 2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380 Mar 20 14:00:01 crc kubenswrapper[4895]: I0320 14:00:01.427997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" event={"ID":"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a","Type":"ContainerStarted","Data":"fcd71d0b68095e12a5804149affdef1b9ae7dbfb87e9ce139cc8f2122ac8d986"} Mar 20 14:00:01 crc kubenswrapper[4895]: I0320 14:00:01.428034 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" 
event={"ID":"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a","Type":"ContainerStarted","Data":"2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380"} Mar 20 14:00:01 crc kubenswrapper[4895]: I0320 14:00:01.429912 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-ld29g" event={"ID":"dc1e628a-ef6d-4769-a6ab-77ba414689e3","Type":"ContainerStarted","Data":"2931ebf40a3a2ddf27fb9c6e94cf40976d5989fe5856a97746dc96389f466502"} Mar 20 14:00:01 crc kubenswrapper[4895]: I0320 14:00:01.452052 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" podStartSLOduration=1.452032481 podStartE2EDuration="1.452032481s" podCreationTimestamp="2026-03-20 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:00:01.448525835 +0000 UTC m=+2300.958244811" watchObservedRunningTime="2026-03-20 14:00:01.452032481 +0000 UTC m=+2300.961751447" Mar 20 14:00:02 crc kubenswrapper[4895]: I0320 14:00:02.440837 4895 generic.go:334] "Generic (PLEG): container finished" podID="3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" containerID="fcd71d0b68095e12a5804149affdef1b9ae7dbfb87e9ce139cc8f2122ac8d986" exitCode=0 Mar 20 14:00:02 crc kubenswrapper[4895]: I0320 14:00:02.440917 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" event={"ID":"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a","Type":"ContainerDied","Data":"fcd71d0b68095e12a5804149affdef1b9ae7dbfb87e9ce139cc8f2122ac8d986"} Mar 20 14:00:03 crc kubenswrapper[4895]: I0320 14:00:03.877870 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.029464 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfv4n\" (UniqueName: \"kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n\") pod \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.029963 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume\") pod \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.030053 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume\") pod \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\" (UID: \"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a\") " Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.030704 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" (UID: "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.036522 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" (UID: "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.037134 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n" (OuterVolumeSpecName: "kube-api-access-mfv4n") pod "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" (UID: "3de4d5d0-e193-4c77-b93d-bb677e3cfc7a"). InnerVolumeSpecName "kube-api-access-mfv4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.133074 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.133286 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.133594 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfv4n\" (UniqueName: \"kubernetes.io/projected/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a-kube-api-access-mfv4n\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.338746 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h"] Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.349609 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-cvq2h"] Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.466353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" 
event={"ID":"3de4d5d0-e193-4c77-b93d-bb677e3cfc7a","Type":"ContainerDied","Data":"2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380"} Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.466603 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2d9c597442bade95a1c3f1dce534d6dc8e5866d8284d29f62a08696f117380" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.466419 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x" Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.468123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-ld29g" event={"ID":"dc1e628a-ef6d-4769-a6ab-77ba414689e3","Type":"ContainerStarted","Data":"1a78a8aadac52a1aabb49b4d64c647cdd504c41ea3e849df2d7a019355d21843"} Mar 20 14:00:04 crc kubenswrapper[4895]: I0320 14:00:04.484873 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566920-ld29g" podStartSLOduration=1.357293151 podStartE2EDuration="4.484856884s" podCreationTimestamp="2026-03-20 14:00:00 +0000 UTC" firstStartedPulling="2026-03-20 14:00:00.981570485 +0000 UTC m=+2300.491289451" lastFinishedPulling="2026-03-20 14:00:04.109134218 +0000 UTC m=+2303.618853184" observedRunningTime="2026-03-20 14:00:04.4802245 +0000 UTC m=+2303.989943466" watchObservedRunningTime="2026-03-20 14:00:04.484856884 +0000 UTC m=+2303.994575850" Mar 20 14:00:05 crc kubenswrapper[4895]: I0320 14:00:05.250832 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444aebdc-d867-44b7-9884-e0d89fea57d8" path="/var/lib/kubelet/pods/444aebdc-d867-44b7-9884-e0d89fea57d8/volumes" Mar 20 14:00:05 crc kubenswrapper[4895]: I0320 14:00:05.485584 4895 generic.go:334] "Generic (PLEG): container finished" podID="dc1e628a-ef6d-4769-a6ab-77ba414689e3" 
containerID="1a78a8aadac52a1aabb49b4d64c647cdd504c41ea3e849df2d7a019355d21843" exitCode=0 Mar 20 14:00:05 crc kubenswrapper[4895]: I0320 14:00:05.485682 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-ld29g" event={"ID":"dc1e628a-ef6d-4769-a6ab-77ba414689e3","Type":"ContainerDied","Data":"1a78a8aadac52a1aabb49b4d64c647cdd504c41ea3e849df2d7a019355d21843"} Mar 20 14:00:06 crc kubenswrapper[4895]: I0320 14:00:06.879527 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:06 crc kubenswrapper[4895]: I0320 14:00:06.905039 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjhkc\" (UniqueName: \"kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc\") pod \"dc1e628a-ef6d-4769-a6ab-77ba414689e3\" (UID: \"dc1e628a-ef6d-4769-a6ab-77ba414689e3\") " Mar 20 14:00:06 crc kubenswrapper[4895]: I0320 14:00:06.916836 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc" (OuterVolumeSpecName: "kube-api-access-zjhkc") pod "dc1e628a-ef6d-4769-a6ab-77ba414689e3" (UID: "dc1e628a-ef6d-4769-a6ab-77ba414689e3"). InnerVolumeSpecName "kube-api-access-zjhkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.008063 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjhkc\" (UniqueName: \"kubernetes.io/projected/dc1e628a-ef6d-4769-a6ab-77ba414689e3-kube-api-access-zjhkc\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.516235 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-ld29g" event={"ID":"dc1e628a-ef6d-4769-a6ab-77ba414689e3","Type":"ContainerDied","Data":"2931ebf40a3a2ddf27fb9c6e94cf40976d5989fe5856a97746dc96389f466502"} Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.516621 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2931ebf40a3a2ddf27fb9c6e94cf40976d5989fe5856a97746dc96389f466502" Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.516295 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-ld29g" Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.553977 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-trnlb"] Mar 20 14:00:07 crc kubenswrapper[4895]: I0320 14:00:07.572038 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-trnlb"] Mar 20 14:00:08 crc kubenswrapper[4895]: I0320 14:00:08.212113 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:00:08 crc kubenswrapper[4895]: E0320 14:00:08.212641 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:00:09 crc kubenswrapper[4895]: I0320 14:00:09.243178 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e" path="/var/lib/kubelet/pods/25861fd5-ca0e-4822-9cc7-e8b3e53b5d4e/volumes" Mar 20 14:00:19 crc kubenswrapper[4895]: I0320 14:00:19.500569 4895 scope.go:117] "RemoveContainer" containerID="99176121a628006e8cd09e7d6a983e4142f60503ab30551b920724996ab58187" Mar 20 14:00:19 crc kubenswrapper[4895]: I0320 14:00:19.551060 4895 scope.go:117] "RemoveContainer" containerID="c51ce63f69819dd27c1a17f9fc4873f38376b68cef2652af8006221a827ab441" Mar 20 14:00:21 crc kubenswrapper[4895]: I0320 14:00:21.836996 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:00:21 crc kubenswrapper[4895]: E0320 14:00:21.838704 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:00:35 crc kubenswrapper[4895]: I0320 14:00:35.211932 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:00:35 crc kubenswrapper[4895]: E0320 14:00:35.212692 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:00:48 crc kubenswrapper[4895]: I0320 14:00:48.211614 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:00:48 crc kubenswrapper[4895]: E0320 14:00:48.212261 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.154514 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566921-87zd9"] Mar 20 14:01:00 crc kubenswrapper[4895]: E0320 14:01:00.155618 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1e628a-ef6d-4769-a6ab-77ba414689e3" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.155633 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1e628a-ef6d-4769-a6ab-77ba414689e3" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4895]: E0320 14:01:00.155680 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.155688 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.155886 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1e628a-ef6d-4769-a6ab-77ba414689e3" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.155923 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.156715 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.187229 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-87zd9"] Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.242862 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.243019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.243046 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.243103 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn657\" (UniqueName: \"kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657\") pod \"keystone-cron-29566921-87zd9\" 
(UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.344837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn657\" (UniqueName: \"kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.344950 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.345022 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.345045 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.351237 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " 
pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.352810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.356694 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.369352 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn657\" (UniqueName: \"kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657\") pod \"keystone-cron-29566921-87zd9\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.483062 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:00 crc kubenswrapper[4895]: I0320 14:01:00.983078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-87zd9"] Mar 20 14:01:01 crc kubenswrapper[4895]: I0320 14:01:01.235244 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-87zd9" event={"ID":"f4cdaa4c-2563-4f80-924b-33f19fd8a099","Type":"ContainerStarted","Data":"4eae8703afb6059d4d1dd850d350ec425586820ebcbc0ee8fb9b324bc95bd3ae"} Mar 20 14:01:01 crc kubenswrapper[4895]: I0320 14:01:01.235289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-87zd9" event={"ID":"f4cdaa4c-2563-4f80-924b-33f19fd8a099","Type":"ContainerStarted","Data":"4445ae2c69023526774d39881c6f04a0593a17b9cae575c3226af2cdad159b3e"} Mar 20 14:01:02 crc kubenswrapper[4895]: I0320 14:01:02.264447 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566921-87zd9" podStartSLOduration=2.264421575 podStartE2EDuration="2.264421575s" podCreationTimestamp="2026-03-20 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:01:02.260308554 +0000 UTC m=+2361.770027520" watchObservedRunningTime="2026-03-20 14:01:02.264421575 +0000 UTC m=+2361.774140541" Mar 20 14:01:03 crc kubenswrapper[4895]: I0320 14:01:03.212861 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:01:03 crc kubenswrapper[4895]: E0320 14:01:03.213578 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:01:06 crc kubenswrapper[4895]: I0320 14:01:06.283501 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4cdaa4c-2563-4f80-924b-33f19fd8a099" containerID="4eae8703afb6059d4d1dd850d350ec425586820ebcbc0ee8fb9b324bc95bd3ae" exitCode=0 Mar 20 14:01:06 crc kubenswrapper[4895]: I0320 14:01:06.283563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-87zd9" event={"ID":"f4cdaa4c-2563-4f80-924b-33f19fd8a099","Type":"ContainerDied","Data":"4eae8703afb6059d4d1dd850d350ec425586820ebcbc0ee8fb9b324bc95bd3ae"} Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.706149 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.853543 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data\") pod \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.854059 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn657\" (UniqueName: \"kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657\") pod \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.854182 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle\") pod \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 
14:01:07.854206 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys\") pod \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\" (UID: \"f4cdaa4c-2563-4f80-924b-33f19fd8a099\") " Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.858935 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f4cdaa4c-2563-4f80-924b-33f19fd8a099" (UID: "f4cdaa4c-2563-4f80-924b-33f19fd8a099"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.892621 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657" (OuterVolumeSpecName: "kube-api-access-wn657") pod "f4cdaa4c-2563-4f80-924b-33f19fd8a099" (UID: "f4cdaa4c-2563-4f80-924b-33f19fd8a099"). InnerVolumeSpecName "kube-api-access-wn657". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.958272 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn657\" (UniqueName: \"kubernetes.io/projected/f4cdaa4c-2563-4f80-924b-33f19fd8a099-kube-api-access-wn657\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.958324 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:07.970035 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4cdaa4c-2563-4f80-924b-33f19fd8a099" (UID: "f4cdaa4c-2563-4f80-924b-33f19fd8a099"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.006517 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data" (OuterVolumeSpecName: "config-data") pod "f4cdaa4c-2563-4f80-924b-33f19fd8a099" (UID: "f4cdaa4c-2563-4f80-924b-33f19fd8a099"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.060757 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.060783 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4cdaa4c-2563-4f80-924b-33f19fd8a099-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.307442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-87zd9" event={"ID":"f4cdaa4c-2563-4f80-924b-33f19fd8a099","Type":"ContainerDied","Data":"4445ae2c69023526774d39881c6f04a0593a17b9cae575c3226af2cdad159b3e"} Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.307484 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4445ae2c69023526774d39881c6f04a0593a17b9cae575c3226af2cdad159b3e" Mar 20 14:01:08 crc kubenswrapper[4895]: I0320 14:01:08.307482 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-87zd9" Mar 20 14:01:15 crc kubenswrapper[4895]: I0320 14:01:15.211939 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:01:15 crc kubenswrapper[4895]: E0320 14:01:15.212661 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:01:26 crc kubenswrapper[4895]: I0320 14:01:26.211768 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:01:26 crc kubenswrapper[4895]: E0320 14:01:26.213646 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:01:26 crc kubenswrapper[4895]: I0320 14:01:26.487573 4895 generic.go:334] "Generic (PLEG): container finished" podID="584873be-8282-406d-9a6a-2abb61f6d3bd" containerID="e597e8d3a6aaf2de1d035815a641c2ffa27d7291c179e69b59675413d2c4b8ee" exitCode=0 Mar 20 14:01:26 crc kubenswrapper[4895]: I0320 14:01:26.487624 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" 
event={"ID":"584873be-8282-406d-9a6a-2abb61f6d3bd","Type":"ContainerDied","Data":"e597e8d3a6aaf2de1d035815a641c2ffa27d7291c179e69b59675413d2c4b8ee"} Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.063462 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.176139 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle\") pod \"584873be-8282-406d-9a6a-2abb61f6d3bd\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.176823 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54fwl\" (UniqueName: \"kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl\") pod \"584873be-8282-406d-9a6a-2abb61f6d3bd\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.176992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam\") pod \"584873be-8282-406d-9a6a-2abb61f6d3bd\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.177091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory\") pod \"584873be-8282-406d-9a6a-2abb61f6d3bd\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.177240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" 
(UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0\") pod \"584873be-8282-406d-9a6a-2abb61f6d3bd\" (UID: \"584873be-8282-406d-9a6a-2abb61f6d3bd\") " Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.181850 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "584873be-8282-406d-9a6a-2abb61f6d3bd" (UID: "584873be-8282-406d-9a6a-2abb61f6d3bd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.183616 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl" (OuterVolumeSpecName: "kube-api-access-54fwl") pod "584873be-8282-406d-9a6a-2abb61f6d3bd" (UID: "584873be-8282-406d-9a6a-2abb61f6d3bd"). InnerVolumeSpecName "kube-api-access-54fwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.204915 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory" (OuterVolumeSpecName: "inventory") pod "584873be-8282-406d-9a6a-2abb61f6d3bd" (UID: "584873be-8282-406d-9a6a-2abb61f6d3bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.207107 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "584873be-8282-406d-9a6a-2abb61f6d3bd" (UID: "584873be-8282-406d-9a6a-2abb61f6d3bd"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.225617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "584873be-8282-406d-9a6a-2abb61f6d3bd" (UID: "584873be-8282-406d-9a6a-2abb61f6d3bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.280437 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.280467 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.280479 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.280488 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584873be-8282-406d-9a6a-2abb61f6d3bd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.280497 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54fwl\" (UniqueName: \"kubernetes.io/projected/584873be-8282-406d-9a6a-2abb61f6d3bd-kube-api-access-54fwl\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.510993 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" event={"ID":"584873be-8282-406d-9a6a-2abb61f6d3bd","Type":"ContainerDied","Data":"b450fe6a4bc3165461cf259d8efd8c6adf38d5c573047400ac842ae20d54cb97"} Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.511227 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b450fe6a4bc3165461cf259d8efd8c6adf38d5c573047400ac842ae20d54cb97" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.511423 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.603706 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr"] Mar 20 14:01:28 crc kubenswrapper[4895]: E0320 14:01:28.604136 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cdaa4c-2563-4f80-924b-33f19fd8a099" containerName="keystone-cron" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.604158 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cdaa4c-2563-4f80-924b-33f19fd8a099" containerName="keystone-cron" Mar 20 14:01:28 crc kubenswrapper[4895]: E0320 14:01:28.604183 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584873be-8282-406d-9a6a-2abb61f6d3bd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.604191 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="584873be-8282-406d-9a6a-2abb61f6d3bd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.606772 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="584873be-8282-406d-9a6a-2abb61f6d3bd" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.606814 
4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cdaa4c-2563-4f80-924b-33f19fd8a099" containerName="keystone-cron" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.607664 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.611928 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612187 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612273 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612316 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612340 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612370 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.612578 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.617720 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr"] Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.627717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.627777 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jts5g\" (UniqueName: \"kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.627939 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.627993 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628013 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: 
\"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628076 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628501 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc 
kubenswrapper[4895]: I0320 14:01:28.628629 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.628822 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731061 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731206 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731247 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jts5g\" (UniqueName: \"kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731349 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731400 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731444 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731475 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.731525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.732354 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.735640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.736366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.736661 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.738011 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.738292 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.743168 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.745129 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.748009 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.751024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.752637 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jts5g\" (UniqueName: \"kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5v4cr\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:28 crc kubenswrapper[4895]: I0320 14:01:28.932520 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:01:29 crc kubenswrapper[4895]: I0320 14:01:29.525485 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr"] Mar 20 14:01:30 crc kubenswrapper[4895]: I0320 14:01:30.531076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" event={"ID":"dabc07a2-735b-409d-826e-9f4877cc40fe","Type":"ContainerStarted","Data":"acc355fc002c042ef9cba7c445f7d3ea79bf9e5a78a25214a71c93522c19e2de"} Mar 20 14:01:30 crc kubenswrapper[4895]: I0320 14:01:30.531330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" event={"ID":"dabc07a2-735b-409d-826e-9f4877cc40fe","Type":"ContainerStarted","Data":"ac29189518e25f7fda8c0bf63b3ac1fb932b6c2e644f4a212241130c631e9ed0"} Mar 20 14:01:30 crc kubenswrapper[4895]: I0320 14:01:30.555933 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" podStartSLOduration=2.121423746 
podStartE2EDuration="2.55591476s" podCreationTimestamp="2026-03-20 14:01:28 +0000 UTC" firstStartedPulling="2026-03-20 14:01:29.529549825 +0000 UTC m=+2389.039268791" lastFinishedPulling="2026-03-20 14:01:29.964040839 +0000 UTC m=+2389.473759805" observedRunningTime="2026-03-20 14:01:30.551698417 +0000 UTC m=+2390.061417383" watchObservedRunningTime="2026-03-20 14:01:30.55591476 +0000 UTC m=+2390.065633736" Mar 20 14:01:39 crc kubenswrapper[4895]: I0320 14:01:39.212067 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:01:39 crc kubenswrapper[4895]: E0320 14:01:39.213151 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:01:53 crc kubenswrapper[4895]: I0320 14:01:53.213314 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:01:53 crc kubenswrapper[4895]: E0320 14:01:53.214577 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.153283 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566922-wrp5k"] Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.155694 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.157878 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.157919 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.157992 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.165000 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-wrp5k"] Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.250945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqng\" (UniqueName: \"kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng\") pod \"auto-csr-approver-29566922-wrp5k\" (UID: \"b11e21c6-22cf-486b-81fc-7aeb0d1aa329\") " pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.352959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqng\" (UniqueName: \"kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng\") pod \"auto-csr-approver-29566922-wrp5k\" (UID: \"b11e21c6-22cf-486b-81fc-7aeb0d1aa329\") " pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.390575 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqng\" (UniqueName: \"kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng\") pod \"auto-csr-approver-29566922-wrp5k\" (UID: 
\"b11e21c6-22cf-486b-81fc-7aeb0d1aa329\") " pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.478831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:00 crc kubenswrapper[4895]: I0320 14:02:00.925720 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-wrp5k"] Mar 20 14:02:01 crc kubenswrapper[4895]: I0320 14:02:01.839056 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" event={"ID":"b11e21c6-22cf-486b-81fc-7aeb0d1aa329","Type":"ContainerStarted","Data":"2dcd45d7b4cfb6378d92b32de9a71d38572b6088a6753072e31e49c72ca0161d"} Mar 20 14:02:02 crc kubenswrapper[4895]: I0320 14:02:02.849040 4895 generic.go:334] "Generic (PLEG): container finished" podID="b11e21c6-22cf-486b-81fc-7aeb0d1aa329" containerID="529c2e70aec01c059701877c289186d4c3f67e07f3fc3c173b2cba3ac1e8e893" exitCode=0 Mar 20 14:02:02 crc kubenswrapper[4895]: I0320 14:02:02.849138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" event={"ID":"b11e21c6-22cf-486b-81fc-7aeb0d1aa329","Type":"ContainerDied","Data":"529c2e70aec01c059701877c289186d4c3f67e07f3fc3c173b2cba3ac1e8e893"} Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.258317 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.340983 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqng\" (UniqueName: \"kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng\") pod \"b11e21c6-22cf-486b-81fc-7aeb0d1aa329\" (UID: \"b11e21c6-22cf-486b-81fc-7aeb0d1aa329\") " Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.346543 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng" (OuterVolumeSpecName: "kube-api-access-sgqng") pod "b11e21c6-22cf-486b-81fc-7aeb0d1aa329" (UID: "b11e21c6-22cf-486b-81fc-7aeb0d1aa329"). InnerVolumeSpecName "kube-api-access-sgqng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.443036 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqng\" (UniqueName: \"kubernetes.io/projected/b11e21c6-22cf-486b-81fc-7aeb0d1aa329-kube-api-access-sgqng\") on node \"crc\" DevicePath \"\"" Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.873153 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" event={"ID":"b11e21c6-22cf-486b-81fc-7aeb0d1aa329","Type":"ContainerDied","Data":"2dcd45d7b4cfb6378d92b32de9a71d38572b6088a6753072e31e49c72ca0161d"} Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.873216 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcd45d7b4cfb6378d92b32de9a71d38572b6088a6753072e31e49c72ca0161d" Mar 20 14:02:04 crc kubenswrapper[4895]: I0320 14:02:04.873215 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-wrp5k" Mar 20 14:02:05 crc kubenswrapper[4895]: I0320 14:02:05.334463 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-pphbn"] Mar 20 14:02:05 crc kubenswrapper[4895]: I0320 14:02:05.343333 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-pphbn"] Mar 20 14:02:07 crc kubenswrapper[4895]: I0320 14:02:07.212193 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:02:07 crc kubenswrapper[4895]: E0320 14:02:07.213065 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:02:07 crc kubenswrapper[4895]: I0320 14:02:07.224846 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b1bb13-ebec-4961-afb7-c14cba59bd99" path="/var/lib/kubelet/pods/73b1bb13-ebec-4961-afb7-c14cba59bd99/volumes" Mar 20 14:02:19 crc kubenswrapper[4895]: I0320 14:02:19.707783 4895 scope.go:117] "RemoveContainer" containerID="e1c43fa8ff8a386234aeab037db15eb7c5d66576c5986243a27b7f5bc3c4c444" Mar 20 14:02:20 crc kubenswrapper[4895]: I0320 14:02:20.211894 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:02:20 crc kubenswrapper[4895]: E0320 14:02:20.212340 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:02:31 crc kubenswrapper[4895]: I0320 14:02:31.222354 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:02:31 crc kubenswrapper[4895]: E0320 14:02:31.223301 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:02:43 crc kubenswrapper[4895]: I0320 14:02:43.212531 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:02:43 crc kubenswrapper[4895]: E0320 14:02:43.213335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:02:56 crc kubenswrapper[4895]: I0320 14:02:56.212283 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:02:56 crc kubenswrapper[4895]: E0320 14:02:56.213207 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:03:07 crc kubenswrapper[4895]: I0320 14:03:07.212891 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:03:07 crc kubenswrapper[4895]: E0320 14:03:07.213794 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:03:19 crc kubenswrapper[4895]: I0320 14:03:19.211444 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:03:19 crc kubenswrapper[4895]: E0320 14:03:19.212299 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:03:30 crc kubenswrapper[4895]: I0320 14:03:30.212588 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:03:30 crc kubenswrapper[4895]: E0320 14:03:30.213683 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:03:41 crc kubenswrapper[4895]: I0320 14:03:41.952834 4895 generic.go:334] "Generic (PLEG): container finished" podID="dabc07a2-735b-409d-826e-9f4877cc40fe" containerID="acc355fc002c042ef9cba7c445f7d3ea79bf9e5a78a25214a71c93522c19e2de" exitCode=0 Mar 20 14:03:41 crc kubenswrapper[4895]: I0320 14:03:41.952920 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" event={"ID":"dabc07a2-735b-409d-826e-9f4877cc40fe","Type":"ContainerDied","Data":"acc355fc002c042ef9cba7c445f7d3ea79bf9e5a78a25214a71c93522c19e2de"} Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.212612 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:03:43 crc kubenswrapper[4895]: E0320 14:03:43.213175 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.449627 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.595917 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jts5g\" (UniqueName: \"kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596056 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596097 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596177 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596215 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596550 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.596586 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2\") pod \"dabc07a2-735b-409d-826e-9f4877cc40fe\" (UID: \"dabc07a2-735b-409d-826e-9f4877cc40fe\") " Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.602803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g" (OuterVolumeSpecName: "kube-api-access-jts5g") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "kube-api-access-jts5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.602912 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.634933 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.636560 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.637369 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.637373 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.637833 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory" (OuterVolumeSpecName: "inventory") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.639556 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.640416 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.646363 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.653554 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dabc07a2-735b-409d-826e-9f4877cc40fe" (UID: "dabc07a2-735b-409d-826e-9f4877cc40fe"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699762 4895 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699793 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699804 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699818 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699827 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699837 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699847 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699856 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699866 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jts5g\" (UniqueName: \"kubernetes.io/projected/dabc07a2-735b-409d-826e-9f4877cc40fe-kube-api-access-jts5g\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699874 4895 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.699883 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dabc07a2-735b-409d-826e-9f4877cc40fe-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.979087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" event={"ID":"dabc07a2-735b-409d-826e-9f4877cc40fe","Type":"ContainerDied","Data":"ac29189518e25f7fda8c0bf63b3ac1fb932b6c2e644f4a212241130c631e9ed0"} Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.979133 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac29189518e25f7fda8c0bf63b3ac1fb932b6c2e644f4a212241130c631e9ed0" Mar 20 14:03:43 crc kubenswrapper[4895]: I0320 14:03:43.979149 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5v4cr" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.104565 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv"] Mar 20 14:03:44 crc kubenswrapper[4895]: E0320 14:03:44.105054 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabc07a2-735b-409d-826e-9f4877cc40fe" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.105103 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabc07a2-735b-409d-826e-9f4877cc40fe" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:44 crc kubenswrapper[4895]: E0320 14:03:44.105145 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11e21c6-22cf-486b-81fc-7aeb0d1aa329" containerName="oc" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.105155 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11e21c6-22cf-486b-81fc-7aeb0d1aa329" containerName="oc" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.105428 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabc07a2-735b-409d-826e-9f4877cc40fe" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.105454 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11e21c6-22cf-486b-81fc-7aeb0d1aa329" containerName="oc" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.106337 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.108756 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.109019 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.109166 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.109336 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4r4sh" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.110327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.111986 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv"] Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.212210 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.212476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: 
\"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.212578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nh97\" (UniqueName: \"kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.212853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.212921 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.213022 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.213069 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.314926 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.314984 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nh97\" (UniqueName: \"kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.315105 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.315142 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.315178 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.315202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.315238 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.319794 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.319901 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.319928 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.320852 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.321382 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.324342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.332225 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nh97\" (UniqueName: \"kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.438227 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.960486 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv"] Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.964910 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:03:44 crc kubenswrapper[4895]: I0320 14:03:44.989314 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" event={"ID":"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525","Type":"ContainerStarted","Data":"832737b7035b8c945f4367f2be5bb3cf1d26163126f76cce1ba7e3a12619d934"} Mar 20 14:03:47 crc kubenswrapper[4895]: I0320 14:03:47.012514 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" event={"ID":"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525","Type":"ContainerStarted","Data":"f03a86203a069ee01aad5ec5e9f2a80f8e8ce121db3d4a369497240cf658917a"} Mar 20 14:03:47 crc kubenswrapper[4895]: I0320 14:03:47.034103 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" podStartSLOduration=2.132898999 podStartE2EDuration="3.034081101s" podCreationTimestamp="2026-03-20 14:03:44 +0000 UTC" firstStartedPulling="2026-03-20 14:03:44.964626447 +0000 UTC m=+2524.474345413" lastFinishedPulling="2026-03-20 14:03:45.865808549 +0000 UTC m=+2525.375527515" observedRunningTime="2026-03-20 14:03:47.030022023 +0000 UTC m=+2526.539740989" watchObservedRunningTime="2026-03-20 14:03:47.034081101 +0000 UTC m=+2526.543800107" Mar 20 14:03:57 crc kubenswrapper[4895]: I0320 14:03:57.212145 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:03:57 crc kubenswrapper[4895]: E0320 
14:03:57.213247 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.145007 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566924-vhjzn"] Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.147328 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.150122 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.150472 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.151282 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.156720 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-vhjzn"] Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.299704 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhlh\" (UniqueName: \"kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh\") pod \"auto-csr-approver-29566924-vhjzn\" (UID: \"46a3e8f0-a68d-4612-987f-4624c41bf952\") " pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 
14:04:00.401777 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhlh\" (UniqueName: \"kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh\") pod \"auto-csr-approver-29566924-vhjzn\" (UID: \"46a3e8f0-a68d-4612-987f-4624c41bf952\") " pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.423579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhlh\" (UniqueName: \"kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh\") pod \"auto-csr-approver-29566924-vhjzn\" (UID: \"46a3e8f0-a68d-4612-987f-4624c41bf952\") " pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.467834 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:00 crc kubenswrapper[4895]: W0320 14:04:00.935304 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a3e8f0_a68d_4612_987f_4624c41bf952.slice/crio-79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b WatchSource:0}: Error finding container 79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b: Status 404 returned error can't find the container with id 79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b Mar 20 14:04:00 crc kubenswrapper[4895]: I0320 14:04:00.936416 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-vhjzn"] Mar 20 14:04:01 crc kubenswrapper[4895]: I0320 14:04:01.201181 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" 
event={"ID":"46a3e8f0-a68d-4612-987f-4624c41bf952","Type":"ContainerStarted","Data":"79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b"} Mar 20 14:04:03 crc kubenswrapper[4895]: I0320 14:04:03.227593 4895 generic.go:334] "Generic (PLEG): container finished" podID="46a3e8f0-a68d-4612-987f-4624c41bf952" containerID="2decac017c3f4fe1a1099f20d620cd605c539df53f67adddd3f031a42ff252fe" exitCode=0 Mar 20 14:04:03 crc kubenswrapper[4895]: I0320 14:04:03.227669 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" event={"ID":"46a3e8f0-a68d-4612-987f-4624c41bf952","Type":"ContainerDied","Data":"2decac017c3f4fe1a1099f20d620cd605c539df53f67adddd3f031a42ff252fe"} Mar 20 14:04:04 crc kubenswrapper[4895]: I0320 14:04:04.644213 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:04 crc kubenswrapper[4895]: I0320 14:04:04.747642 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhhlh\" (UniqueName: \"kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh\") pod \"46a3e8f0-a68d-4612-987f-4624c41bf952\" (UID: \"46a3e8f0-a68d-4612-987f-4624c41bf952\") " Mar 20 14:04:04 crc kubenswrapper[4895]: I0320 14:04:04.753042 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh" (OuterVolumeSpecName: "kube-api-access-mhhlh") pod "46a3e8f0-a68d-4612-987f-4624c41bf952" (UID: "46a3e8f0-a68d-4612-987f-4624c41bf952"). InnerVolumeSpecName "kube-api-access-mhhlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:04:04 crc kubenswrapper[4895]: I0320 14:04:04.849988 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhhlh\" (UniqueName: \"kubernetes.io/projected/46a3e8f0-a68d-4612-987f-4624c41bf952-kube-api-access-mhhlh\") on node \"crc\" DevicePath \"\"" Mar 20 14:04:05 crc kubenswrapper[4895]: I0320 14:04:05.253138 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" Mar 20 14:04:05 crc kubenswrapper[4895]: I0320 14:04:05.253136 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-vhjzn" event={"ID":"46a3e8f0-a68d-4612-987f-4624c41bf952","Type":"ContainerDied","Data":"79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b"} Mar 20 14:04:05 crc kubenswrapper[4895]: I0320 14:04:05.253784 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79a4afe8ad4d6909915d68d8f4004e30893d7b3f960c81cb9e6ee4724ad47e3b" Mar 20 14:04:05 crc kubenswrapper[4895]: I0320 14:04:05.720153 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-89qqb"] Mar 20 14:04:05 crc kubenswrapper[4895]: I0320 14:04:05.730744 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-89qqb"] Mar 20 14:04:07 crc kubenswrapper[4895]: I0320 14:04:07.225509 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc89b85a-ad36-4408-a21e-c83299aa045b" path="/var/lib/kubelet/pods/fc89b85a-ad36-4408-a21e-c83299aa045b/volumes" Mar 20 14:04:09 crc kubenswrapper[4895]: I0320 14:04:09.212184 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:04:09 crc kubenswrapper[4895]: E0320 14:04:09.212625 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:04:19 crc kubenswrapper[4895]: I0320 14:04:19.819089 4895 scope.go:117] "RemoveContainer" containerID="fd880478f1729cc6de52a360937ab6a2315c353b5da97495e3826f4658a5ef00" Mar 20 14:04:22 crc kubenswrapper[4895]: I0320 14:04:22.211742 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:04:22 crc kubenswrapper[4895]: E0320 14:04:22.212785 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:04:35 crc kubenswrapper[4895]: I0320 14:04:35.212579 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:04:35 crc kubenswrapper[4895]: E0320 14:04:35.213552 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:04:46 crc kubenswrapper[4895]: I0320 14:04:46.212449 4895 scope.go:117] "RemoveContainer" 
containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:04:46 crc kubenswrapper[4895]: E0320 14:04:46.213248 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:04:59 crc kubenswrapper[4895]: I0320 14:04:59.211747 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:04:59 crc kubenswrapper[4895]: I0320 14:04:59.823981 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55"} Mar 20 14:05:52 crc kubenswrapper[4895]: I0320 14:05:52.350983 4895 generic.go:334] "Generic (PLEG): container finished" podID="a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" containerID="f03a86203a069ee01aad5ec5e9f2a80f8e8ce121db3d4a369497240cf658917a" exitCode=0 Mar 20 14:05:52 crc kubenswrapper[4895]: I0320 14:05:52.351084 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" event={"ID":"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525","Type":"ContainerDied","Data":"f03a86203a069ee01aad5ec5e9f2a80f8e8ce121db3d4a369497240cf658917a"} Mar 20 14:05:53 crc kubenswrapper[4895]: I0320 14:05:53.883483 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.058877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059413 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nh97\" (UniqueName: \"kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059453 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 
14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059496 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.059562 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory\") pod \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\" (UID: \"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525\") " Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.065916 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.067244 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97" (OuterVolumeSpecName: "kube-api-access-8nh97") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "kube-api-access-8nh97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.096322 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.097157 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory" (OuterVolumeSpecName: "inventory") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.100867 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.101662 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.110337 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" (UID: "a15bb8dc-2a80-48a7-aa1b-6b0bc8103525"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163090 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163126 4895 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163137 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nh97\" (UniqueName: \"kubernetes.io/projected/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-kube-api-access-8nh97\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163145 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163154 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-0\") on node \"crc\" 
DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163163 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.163172 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a15bb8dc-2a80-48a7-aa1b-6b0bc8103525-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.375841 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" event={"ID":"a15bb8dc-2a80-48a7-aa1b-6b0bc8103525","Type":"ContainerDied","Data":"832737b7035b8c945f4367f2be5bb3cf1d26163126f76cce1ba7e3a12619d934"} Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.375901 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832737b7035b8c945f4367f2be5bb3cf1d26163126f76cce1ba7e3a12619d934" Mar 20 14:05:54 crc kubenswrapper[4895]: I0320 14:05:54.375900 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.146520 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566926-xdvdf"] Mar 20 14:06:00 crc kubenswrapper[4895]: E0320 14:06:00.147665 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a3e8f0-a68d-4612-987f-4624c41bf952" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.147682 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a3e8f0-a68d-4612-987f-4624c41bf952" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4895]: E0320 14:06:00.147705 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.147715 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.147929 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a3e8f0-a68d-4612-987f-4624c41bf952" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.147952 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15bb8dc-2a80-48a7-aa1b-6b0bc8103525" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.148751 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.151183 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.151890 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.153634 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.155755 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-xdvdf"] Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.298959 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7tq\" (UniqueName: \"kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq\") pod \"auto-csr-approver-29566926-xdvdf\" (UID: \"76525fa0-1750-46f8-8d3b-04f8feec2f16\") " pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.400568 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7tq\" (UniqueName: \"kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq\") pod \"auto-csr-approver-29566926-xdvdf\" (UID: \"76525fa0-1750-46f8-8d3b-04f8feec2f16\") " pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.423904 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7tq\" (UniqueName: \"kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq\") pod \"auto-csr-approver-29566926-xdvdf\" (UID: \"76525fa0-1750-46f8-8d3b-04f8feec2f16\") " 
pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.468501 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:00 crc kubenswrapper[4895]: I0320 14:06:00.957008 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-xdvdf"] Mar 20 14:06:01 crc kubenswrapper[4895]: I0320 14:06:01.444748 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" event={"ID":"76525fa0-1750-46f8-8d3b-04f8feec2f16","Type":"ContainerStarted","Data":"499f13cd39a22af19b99f7e3ceff0dd18fffd32dab67f6ae8d79c82801af5f0b"} Mar 20 14:06:03 crc kubenswrapper[4895]: I0320 14:06:03.471101 4895 generic.go:334] "Generic (PLEG): container finished" podID="76525fa0-1750-46f8-8d3b-04f8feec2f16" containerID="b5add1003fac6cb2d034ec31b7a93f851b4e34c1eaea8d3749af2fae04a0153e" exitCode=0 Mar 20 14:06:03 crc kubenswrapper[4895]: I0320 14:06:03.471239 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" event={"ID":"76525fa0-1750-46f8-8d3b-04f8feec2f16","Type":"ContainerDied","Data":"b5add1003fac6cb2d034ec31b7a93f851b4e34c1eaea8d3749af2fae04a0153e"} Mar 20 14:06:04 crc kubenswrapper[4895]: I0320 14:06:04.874089 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:04 crc kubenswrapper[4895]: I0320 14:06:04.996099 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7tq\" (UniqueName: \"kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq\") pod \"76525fa0-1750-46f8-8d3b-04f8feec2f16\" (UID: \"76525fa0-1750-46f8-8d3b-04f8feec2f16\") " Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.002454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq" (OuterVolumeSpecName: "kube-api-access-7v7tq") pod "76525fa0-1750-46f8-8d3b-04f8feec2f16" (UID: "76525fa0-1750-46f8-8d3b-04f8feec2f16"). InnerVolumeSpecName "kube-api-access-7v7tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.098072 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7tq\" (UniqueName: \"kubernetes.io/projected/76525fa0-1750-46f8-8d3b-04f8feec2f16-kube-api-access-7v7tq\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.504210 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" event={"ID":"76525fa0-1750-46f8-8d3b-04f8feec2f16","Type":"ContainerDied","Data":"499f13cd39a22af19b99f7e3ceff0dd18fffd32dab67f6ae8d79c82801af5f0b"} Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.504245 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499f13cd39a22af19b99f7e3ceff0dd18fffd32dab67f6ae8d79c82801af5f0b" Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.504293 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-xdvdf" Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.941953 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-ld29g"] Mar 20 14:06:05 crc kubenswrapper[4895]: I0320 14:06:05.951934 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-ld29g"] Mar 20 14:06:07 crc kubenswrapper[4895]: I0320 14:06:07.222837 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1e628a-ef6d-4769-a6ab-77ba414689e3" path="/var/lib/kubelet/pods/dc1e628a-ef6d-4769-a6ab-77ba414689e3/volumes" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.878029 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f6l6m"] Mar 20 14:06:08 crc kubenswrapper[4895]: E0320 14:06:08.878917 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76525fa0-1750-46f8-8d3b-04f8feec2f16" containerName="oc" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.878936 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="76525fa0-1750-46f8-8d3b-04f8feec2f16" containerName="oc" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.879286 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="76525fa0-1750-46f8-8d3b-04f8feec2f16" containerName="oc" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.881310 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.904262 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6l6m"] Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.982895 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-catalog-content\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.982990 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7hq\" (UniqueName: \"kubernetes.io/projected/27bd4b2f-9705-45e5-b579-04e05fb9abde-kube-api-access-lz7hq\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:08 crc kubenswrapper[4895]: I0320 14:06:08.983140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-utilities\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.084488 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-utilities\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.084566 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-catalog-content\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.084645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7hq\" (UniqueName: \"kubernetes.io/projected/27bd4b2f-9705-45e5-b579-04e05fb9abde-kube-api-access-lz7hq\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.085188 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-utilities\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.085238 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27bd4b2f-9705-45e5-b579-04e05fb9abde-catalog-content\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.103377 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7hq\" (UniqueName: \"kubernetes.io/projected/27bd4b2f-9705-45e5-b579-04e05fb9abde-kube-api-access-lz7hq\") pod \"certified-operators-f6l6m\" (UID: \"27bd4b2f-9705-45e5-b579-04e05fb9abde\") " pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.213757 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:09 crc kubenswrapper[4895]: I0320 14:06:09.808975 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6l6m"] Mar 20 14:06:10 crc kubenswrapper[4895]: I0320 14:06:10.553152 4895 generic.go:334] "Generic (PLEG): container finished" podID="27bd4b2f-9705-45e5-b579-04e05fb9abde" containerID="f04721ddb42e9cd755e7c896173e13f755a78ce831966477cc8477215e4961e1" exitCode=0 Mar 20 14:06:10 crc kubenswrapper[4895]: I0320 14:06:10.553231 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l6m" event={"ID":"27bd4b2f-9705-45e5-b579-04e05fb9abde","Type":"ContainerDied","Data":"f04721ddb42e9cd755e7c896173e13f755a78ce831966477cc8477215e4961e1"} Mar 20 14:06:10 crc kubenswrapper[4895]: I0320 14:06:10.553424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l6m" event={"ID":"27bd4b2f-9705-45e5-b579-04e05fb9abde","Type":"ContainerStarted","Data":"559ce637e0b7feafd03de3546bd2732dd5ed3f4af8ea0b53c5126989f09adba0"} Mar 20 14:06:16 crc kubenswrapper[4895]: I0320 14:06:16.660922 4895 generic.go:334] "Generic (PLEG): container finished" podID="27bd4b2f-9705-45e5-b579-04e05fb9abde" containerID="cf854f26cdbd371bfaceb00b23c967747789f45216a03a215009124760c4d050" exitCode=0 Mar 20 14:06:16 crc kubenswrapper[4895]: I0320 14:06:16.661500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l6m" event={"ID":"27bd4b2f-9705-45e5-b579-04e05fb9abde","Type":"ContainerDied","Data":"cf854f26cdbd371bfaceb00b23c967747789f45216a03a215009124760c4d050"} Mar 20 14:06:17 crc kubenswrapper[4895]: I0320 14:06:17.673351 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l6m" 
event={"ID":"27bd4b2f-9705-45e5-b579-04e05fb9abde","Type":"ContainerStarted","Data":"88c2d41834eb6cba7e28daac7dea19cdd479a7ad2665c4bb4d8ccaec476b03f2"} Mar 20 14:06:18 crc kubenswrapper[4895]: I0320 14:06:18.707710 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f6l6m" podStartSLOduration=4.029626826 podStartE2EDuration="10.707688304s" podCreationTimestamp="2026-03-20 14:06:08 +0000 UTC" firstStartedPulling="2026-03-20 14:06:10.5549122 +0000 UTC m=+2670.064631166" lastFinishedPulling="2026-03-20 14:06:17.232973668 +0000 UTC m=+2676.742692644" observedRunningTime="2026-03-20 14:06:18.701064133 +0000 UTC m=+2678.210783089" watchObservedRunningTime="2026-03-20 14:06:18.707688304 +0000 UTC m=+2678.217407270" Mar 20 14:06:19 crc kubenswrapper[4895]: I0320 14:06:19.240260 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:19 crc kubenswrapper[4895]: I0320 14:06:19.240384 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:19 crc kubenswrapper[4895]: I0320 14:06:19.289028 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:19 crc kubenswrapper[4895]: I0320 14:06:19.925995 4895 scope.go:117] "RemoveContainer" containerID="1a78a8aadac52a1aabb49b4d64c647cdd504c41ea3e849df2d7a019355d21843" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.403221 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.406436 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.423483 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.580784 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.581165 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.581206 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5fb\" (UniqueName: \"kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.683461 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.683545 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8j5fb\" (UniqueName: \"kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.683704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.683965 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.684113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.721366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5fb\" (UniqueName: \"kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb\") pod \"redhat-operators-6vfbh\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:22 crc kubenswrapper[4895]: I0320 14:06:22.730352 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:23 crc kubenswrapper[4895]: I0320 14:06:23.311875 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:23 crc kubenswrapper[4895]: I0320 14:06:23.787355 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerID="1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2" exitCode=0 Mar 20 14:06:23 crc kubenswrapper[4895]: I0320 14:06:23.787575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerDied","Data":"1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2"} Mar 20 14:06:23 crc kubenswrapper[4895]: I0320 14:06:23.787815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerStarted","Data":"4a628019f814507cc4ae612ac213e100665590f9abbd6486d2056cb7ddff31ed"} Mar 20 14:06:25 crc kubenswrapper[4895]: I0320 14:06:25.811724 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerStarted","Data":"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc"} Mar 20 14:06:29 crc kubenswrapper[4895]: I0320 14:06:29.268831 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f6l6m" Mar 20 14:06:29 crc kubenswrapper[4895]: I0320 14:06:29.335791 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6l6m"] Mar 20 14:06:29 crc kubenswrapper[4895]: I0320 14:06:29.382881 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"] 
Mar 20 14:06:29 crc kubenswrapper[4895]: I0320 14:06:29.383140 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rx9gl" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="registry-server" containerID="cri-o://4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8" gracePeriod=2 Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.594917 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rx9gl" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.670113 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9zsb\" (UniqueName: \"kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb\") pod \"539ac798-b3af-410c-9e5e-4e45263b692b\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.670527 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content\") pod \"539ac798-b3af-410c-9e5e-4e45263b692b\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.670593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities\") pod \"539ac798-b3af-410c-9e5e-4e45263b692b\" (UID: \"539ac798-b3af-410c-9e5e-4e45263b692b\") " Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.672080 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities" (OuterVolumeSpecName: "utilities") pod "539ac798-b3af-410c-9e5e-4e45263b692b" (UID: "539ac798-b3af-410c-9e5e-4e45263b692b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.694632 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb" (OuterVolumeSpecName: "kube-api-access-p9zsb") pod "539ac798-b3af-410c-9e5e-4e45263b692b" (UID: "539ac798-b3af-410c-9e5e-4e45263b692b"). InnerVolumeSpecName "kube-api-access-p9zsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.763388 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "539ac798-b3af-410c-9e5e-4e45263b692b" (UID: "539ac798-b3af-410c-9e5e-4e45263b692b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.773165 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.773201 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539ac798-b3af-410c-9e5e-4e45263b692b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.773213 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9zsb\" (UniqueName: \"kubernetes.io/projected/539ac798-b3af-410c-9e5e-4e45263b692b-kube-api-access-p9zsb\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.873949 4895 generic.go:334] "Generic (PLEG): container finished" podID="539ac798-b3af-410c-9e5e-4e45263b692b" 
containerID="4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8" exitCode=0 Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.873992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerDied","Data":"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8"} Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.874016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rx9gl" event={"ID":"539ac798-b3af-410c-9e5e-4e45263b692b","Type":"ContainerDied","Data":"45bd137c537a496b88dc46efad4ef71a8a4173460c4b30f43ca757790f24a40b"} Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.874032 4895 scope.go:117] "RemoveContainer" containerID="4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.874150 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rx9gl" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.903306 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"] Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.911501 4895 scope.go:117] "RemoveContainer" containerID="509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.921058 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rx9gl"] Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.932798 4895 scope.go:117] "RemoveContainer" containerID="cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.977276 4895 scope.go:117] "RemoveContainer" containerID="4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8" Mar 20 14:06:30 crc kubenswrapper[4895]: E0320 14:06:30.977730 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8\": container with ID starting with 4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8 not found: ID does not exist" containerID="4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.977757 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8"} err="failed to get container status \"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8\": rpc error: code = NotFound desc = could not find container \"4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8\": container with ID starting with 4eac931f2521823b691707027f8c57640d9a9f1c8b08f4c758265527531d39d8 not 
found: ID does not exist" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.977778 4895 scope.go:117] "RemoveContainer" containerID="509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb" Mar 20 14:06:30 crc kubenswrapper[4895]: E0320 14:06:30.978284 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb\": container with ID starting with 509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb not found: ID does not exist" containerID="509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.978307 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb"} err="failed to get container status \"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb\": rpc error: code = NotFound desc = could not find container \"509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb\": container with ID starting with 509b76a2c97e9b7768693d6d679954092961af6e1d161d0e1df4258e1bc0c5fb not found: ID does not exist" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.978322 4895 scope.go:117] "RemoveContainer" containerID="cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6" Mar 20 14:06:30 crc kubenswrapper[4895]: E0320 14:06:30.978593 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6\": container with ID starting with cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6 not found: ID does not exist" containerID="cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6" Mar 20 14:06:30 crc kubenswrapper[4895]: I0320 14:06:30.978614 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6"} err="failed to get container status \"cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6\": rpc error: code = NotFound desc = could not find container \"cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6\": container with ID starting with cf75a8d23b6b5cb47d41103608e8567c673c4b91ab2b433ca6af1543122259e6 not found: ID does not exist" Mar 20 14:06:31 crc kubenswrapper[4895]: I0320 14:06:31.226576 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" path="/var/lib/kubelet/pods/539ac798-b3af-410c-9e5e-4e45263b692b/volumes" Mar 20 14:06:31 crc kubenswrapper[4895]: I0320 14:06:31.885359 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerID="7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc" exitCode=0 Mar 20 14:06:31 crc kubenswrapper[4895]: I0320 14:06:31.885486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerDied","Data":"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc"} Mar 20 14:06:32 crc kubenswrapper[4895]: I0320 14:06:32.900145 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerStarted","Data":"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f"} Mar 20 14:06:32 crc kubenswrapper[4895]: I0320 14:06:32.925609 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6vfbh" podStartSLOduration=2.389166683 podStartE2EDuration="10.925591114s" podCreationTimestamp="2026-03-20 14:06:22 +0000 UTC" 
firstStartedPulling="2026-03-20 14:06:23.78924395 +0000 UTC m=+2683.298962906" lastFinishedPulling="2026-03-20 14:06:32.325668371 +0000 UTC m=+2691.835387337" observedRunningTime="2026-03-20 14:06:32.918223656 +0000 UTC m=+2692.427942622" watchObservedRunningTime="2026-03-20 14:06:32.925591114 +0000 UTC m=+2692.435310080" Mar 20 14:06:42 crc kubenswrapper[4895]: I0320 14:06:42.730725 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:42 crc kubenswrapper[4895]: I0320 14:06:42.731311 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:43 crc kubenswrapper[4895]: I0320 14:06:43.812322 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6vfbh" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="registry-server" probeResult="failure" output=< Mar 20 14:06:43 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:06:43 crc kubenswrapper[4895]: > Mar 20 14:06:52 crc kubenswrapper[4895]: I0320 14:06:52.832258 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:52 crc kubenswrapper[4895]: I0320 14:06:52.888226 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:53 crc kubenswrapper[4895]: I0320 14:06:53.078814 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.151752 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6vfbh" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="registry-server" 
containerID="cri-o://d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f" gracePeriod=2 Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.708208 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.791491 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities\") pod \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.791719 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j5fb\" (UniqueName: \"kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb\") pod \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.791822 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content\") pod \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\" (UID: \"3f49ce9e-adb2-4899-a895-01ec8ee6a926\") " Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.792491 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities" (OuterVolumeSpecName: "utilities") pod "3f49ce9e-adb2-4899-a895-01ec8ee6a926" (UID: "3f49ce9e-adb2-4899-a895-01ec8ee6a926"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.798227 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb" (OuterVolumeSpecName: "kube-api-access-8j5fb") pod "3f49ce9e-adb2-4899-a895-01ec8ee6a926" (UID: "3f49ce9e-adb2-4899-a895-01ec8ee6a926"). InnerVolumeSpecName "kube-api-access-8j5fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.798783 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j5fb\" (UniqueName: \"kubernetes.io/projected/3f49ce9e-adb2-4899-a895-01ec8ee6a926-kube-api-access-8j5fb\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.798822 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:54 crc kubenswrapper[4895]: I0320 14:06:54.975360 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f49ce9e-adb2-4899-a895-01ec8ee6a926" (UID: "3f49ce9e-adb2-4899-a895-01ec8ee6a926"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.002690 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f49ce9e-adb2-4899-a895-01ec8ee6a926-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.163145 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerID="d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f" exitCode=0 Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.163185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerDied","Data":"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f"} Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.163207 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vfbh" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.163222 4895 scope.go:117] "RemoveContainer" containerID="d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.163210 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vfbh" event={"ID":"3f49ce9e-adb2-4899-a895-01ec8ee6a926","Type":"ContainerDied","Data":"4a628019f814507cc4ae612ac213e100665590f9abbd6486d2056cb7ddff31ed"} Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.191743 4895 scope.go:117] "RemoveContainer" containerID="7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.199208 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.210239 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6vfbh"] Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.223130 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" path="/var/lib/kubelet/pods/3f49ce9e-adb2-4899-a895-01ec8ee6a926/volumes" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.223431 4895 scope.go:117] "RemoveContainer" containerID="1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.265226 4895 scope.go:117] "RemoveContainer" containerID="d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f" Mar 20 14:06:55 crc kubenswrapper[4895]: E0320 14:06:55.265758 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f\": container with ID starting with 
d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f not found: ID does not exist" containerID="d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.265816 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f"} err="failed to get container status \"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f\": rpc error: code = NotFound desc = could not find container \"d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f\": container with ID starting with d1f69895b4db6f305f3f87cba332b428fdc5e029779751986e82c59c8bad7c6f not found: ID does not exist" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.265849 4895 scope.go:117] "RemoveContainer" containerID="7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc" Mar 20 14:06:55 crc kubenswrapper[4895]: E0320 14:06:55.266355 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc\": container with ID starting with 7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc not found: ID does not exist" containerID="7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.266416 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc"} err="failed to get container status \"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc\": rpc error: code = NotFound desc = could not find container \"7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc\": container with ID starting with 7821f05e7e1457411ee8079ce3aa424099e99cd6d5859851cc74b22eabebe9fc not found: ID does not 
exist" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.266447 4895 scope.go:117] "RemoveContainer" containerID="1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2" Mar 20 14:06:55 crc kubenswrapper[4895]: E0320 14:06:55.266858 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2\": container with ID starting with 1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2 not found: ID does not exist" containerID="1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2" Mar 20 14:06:55 crc kubenswrapper[4895]: I0320 14:06:55.266897 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2"} err="failed to get container status \"1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2\": rpc error: code = NotFound desc = could not find container \"1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2\": container with ID starting with 1bf352208785c7b4a507ed783ec629b6df387b79b19ed848375879944b8a16d2 not found: ID does not exist" Mar 20 14:07:22 crc kubenswrapper[4895]: I0320 14:07:22.296651 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:07:22 crc kubenswrapper[4895]: I0320 14:07:22.298126 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 
14:07:52 crc kubenswrapper[4895]: I0320 14:07:52.297143 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:07:52 crc kubenswrapper[4895]: I0320 14:07:52.297925 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.154923 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566928-bcvc2"] Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.156750 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.156842 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.156910 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.156969 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.157029 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.157121 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.157235 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.157317 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.157382 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.157494 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: E0320 14:08:00.157578 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.157636 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.157900 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f49ce9e-adb2-4899-a895-01ec8ee6a926" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.158005 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="539ac798-b3af-410c-9e5e-4e45263b692b" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.158821 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.161098 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.161327 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.161718 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.170329 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-bcvc2"] Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.287566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rcp\" (UniqueName: \"kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp\") pod \"auto-csr-approver-29566928-bcvc2\" (UID: \"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5\") " pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.389858 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rcp\" (UniqueName: \"kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp\") pod \"auto-csr-approver-29566928-bcvc2\" (UID: \"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5\") " pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.414140 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rcp\" (UniqueName: \"kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp\") pod \"auto-csr-approver-29566928-bcvc2\" (UID: \"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5\") " 
pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.479148 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:00 crc kubenswrapper[4895]: I0320 14:08:00.953902 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-bcvc2"] Mar 20 14:08:00 crc kubenswrapper[4895]: W0320 14:08:00.961474 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b4e72e7_4efc_4ef4_b52f_a4bbfc3842b5.slice/crio-31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17 WatchSource:0}: Error finding container 31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17: Status 404 returned error can't find the container with id 31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17 Mar 20 14:08:01 crc kubenswrapper[4895]: I0320 14:08:01.826721 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" event={"ID":"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5","Type":"ContainerStarted","Data":"31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17"} Mar 20 14:08:03 crc kubenswrapper[4895]: I0320 14:08:03.856765 4895 generic.go:334] "Generic (PLEG): container finished" podID="7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" containerID="db3aee303cc8cbc64ac7ab6bfa12e72c70251487cd335be85b0f88a2e70687eb" exitCode=0 Mar 20 14:08:03 crc kubenswrapper[4895]: I0320 14:08:03.856849 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" event={"ID":"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5","Type":"ContainerDied","Data":"db3aee303cc8cbc64ac7ab6bfa12e72c70251487cd335be85b0f88a2e70687eb"} Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.251408 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.410539 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4rcp\" (UniqueName: \"kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp\") pod \"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5\" (UID: \"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5\") " Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.416997 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp" (OuterVolumeSpecName: "kube-api-access-b4rcp") pod "7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" (UID: "7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5"). InnerVolumeSpecName "kube-api-access-b4rcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.513536 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4rcp\" (UniqueName: \"kubernetes.io/projected/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5-kube-api-access-b4rcp\") on node \"crc\" DevicePath \"\"" Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.876640 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" event={"ID":"7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5","Type":"ContainerDied","Data":"31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17"} Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.876688 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31045fa8317e171177730ecfd23e4c7cb620e451155628b4d6e2b8b8837efe17" Mar 20 14:08:05 crc kubenswrapper[4895]: I0320 14:08:05.876694 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-bcvc2" Mar 20 14:08:06 crc kubenswrapper[4895]: I0320 14:08:06.329908 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-wrp5k"] Mar 20 14:08:06 crc kubenswrapper[4895]: I0320 14:08:06.341795 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-wrp5k"] Mar 20 14:08:07 crc kubenswrapper[4895]: I0320 14:08:07.229976 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11e21c6-22cf-486b-81fc-7aeb0d1aa329" path="/var/lib/kubelet/pods/b11e21c6-22cf-486b-81fc-7aeb0d1aa329/volumes" Mar 20 14:08:20 crc kubenswrapper[4895]: I0320 14:08:20.062014 4895 scope.go:117] "RemoveContainer" containerID="529c2e70aec01c059701877c289186d4c3f67e07f3fc3c173b2cba3ac1e8e893" Mar 20 14:08:22 crc kubenswrapper[4895]: I0320 14:08:22.297682 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:08:22 crc kubenswrapper[4895]: I0320 14:08:22.298047 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:08:22 crc kubenswrapper[4895]: I0320 14:08:22.298121 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 14:08:22 crc kubenswrapper[4895]: I0320 14:08:22.300679 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:08:22 crc kubenswrapper[4895]: I0320 14:08:22.300864 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55" gracePeriod=600 Mar 20 14:08:22 crc kubenswrapper[4895]: E0320 14:08:22.685592 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9e3134_0fea_4e77_a1e4_e74835ee41e8.slice/crio-conmon-b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:08:23 crc kubenswrapper[4895]: I0320 14:08:23.058620 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55" exitCode=0 Mar 20 14:08:23 crc kubenswrapper[4895]: I0320 14:08:23.058685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55"} Mar 20 14:08:23 crc kubenswrapper[4895]: I0320 14:08:23.058986 4895 scope.go:117] "RemoveContainer" containerID="beb74651059aa0b679f0b28f1c635735b08740cabd05d23570b60e1f9e3298f8" Mar 20 14:08:24 crc kubenswrapper[4895]: I0320 14:08:24.075115 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633"} Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.428856 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:33 crc kubenswrapper[4895]: E0320 14:08:33.430726 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" containerName="oc" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.430917 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" containerName="oc" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.431178 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" containerName="oc" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.432783 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.440027 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.504169 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.504231 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxb6b\" (UniqueName: \"kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.504265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.606614 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.606691 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pxb6b\" (UniqueName: \"kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.606744 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.607405 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.607663 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.628651 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxb6b\" (UniqueName: \"kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b\") pod \"community-operators-7cp84\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:33 crc kubenswrapper[4895]: I0320 14:08:33.814502 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:34 crc kubenswrapper[4895]: I0320 14:08:34.339606 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:35 crc kubenswrapper[4895]: I0320 14:08:35.205032 4895 generic.go:334] "Generic (PLEG): container finished" podID="8f7f46bf-5d99-4185-925d-2909e944820c" containerID="98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285" exitCode=0 Mar 20 14:08:35 crc kubenswrapper[4895]: I0320 14:08:35.205136 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerDied","Data":"98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285"} Mar 20 14:08:35 crc kubenswrapper[4895]: I0320 14:08:35.207664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerStarted","Data":"a116286ebdd10241994dfda0c8eb1f14d4ebcfcd91e221406e7dafefe27fc11a"} Mar 20 14:08:37 crc kubenswrapper[4895]: I0320 14:08:37.227098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerStarted","Data":"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58"} Mar 20 14:08:40 crc kubenswrapper[4895]: I0320 14:08:40.264952 4895 generic.go:334] "Generic (PLEG): container finished" podID="8f7f46bf-5d99-4185-925d-2909e944820c" containerID="1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58" exitCode=0 Mar 20 14:08:40 crc kubenswrapper[4895]: I0320 14:08:40.265021 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" 
event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerDied","Data":"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58"} Mar 20 14:08:41 crc kubenswrapper[4895]: I0320 14:08:41.277408 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerStarted","Data":"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39"} Mar 20 14:08:41 crc kubenswrapper[4895]: I0320 14:08:41.301140 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7cp84" podStartSLOduration=2.534027188 podStartE2EDuration="8.301121826s" podCreationTimestamp="2026-03-20 14:08:33 +0000 UTC" firstStartedPulling="2026-03-20 14:08:35.20761538 +0000 UTC m=+2814.717334346" lastFinishedPulling="2026-03-20 14:08:40.974710018 +0000 UTC m=+2820.484428984" observedRunningTime="2026-03-20 14:08:41.298838461 +0000 UTC m=+2820.808557427" watchObservedRunningTime="2026-03-20 14:08:41.301121826 +0000 UTC m=+2820.810840792" Mar 20 14:08:43 crc kubenswrapper[4895]: I0320 14:08:43.815160 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:43 crc kubenswrapper[4895]: I0320 14:08:43.815792 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:43 crc kubenswrapper[4895]: I0320 14:08:43.866945 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:53 crc kubenswrapper[4895]: I0320 14:08:53.865475 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:53 crc kubenswrapper[4895]: I0320 14:08:53.920723 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.400603 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7cp84" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="registry-server" containerID="cri-o://a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39" gracePeriod=2 Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.962985 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.995525 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities\") pod \"8f7f46bf-5d99-4185-925d-2909e944820c\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.995772 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxb6b\" (UniqueName: \"kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b\") pod \"8f7f46bf-5d99-4185-925d-2909e944820c\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.995916 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content\") pod \"8f7f46bf-5d99-4185-925d-2909e944820c\" (UID: \"8f7f46bf-5d99-4185-925d-2909e944820c\") " Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.996547 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities" (OuterVolumeSpecName: "utilities") pod "8f7f46bf-5d99-4185-925d-2909e944820c" (UID: 
"8f7f46bf-5d99-4185-925d-2909e944820c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:08:54 crc kubenswrapper[4895]: I0320 14:08:54.997143 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.003126 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b" (OuterVolumeSpecName: "kube-api-access-pxb6b") pod "8f7f46bf-5d99-4185-925d-2909e944820c" (UID: "8f7f46bf-5d99-4185-925d-2909e944820c"). InnerVolumeSpecName "kube-api-access-pxb6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.048991 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f7f46bf-5d99-4185-925d-2909e944820c" (UID: "8f7f46bf-5d99-4185-925d-2909e944820c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.099686 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f7f46bf-5d99-4185-925d-2909e944820c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.099726 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxb6b\" (UniqueName: \"kubernetes.io/projected/8f7f46bf-5d99-4185-925d-2909e944820c-kube-api-access-pxb6b\") on node \"crc\" DevicePath \"\"" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.413922 4895 generic.go:334] "Generic (PLEG): container finished" podID="8f7f46bf-5d99-4185-925d-2909e944820c" containerID="a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39" exitCode=0 Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.414024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerDied","Data":"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39"} Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.414028 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7cp84" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.414068 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cp84" event={"ID":"8f7f46bf-5d99-4185-925d-2909e944820c","Type":"ContainerDied","Data":"a116286ebdd10241994dfda0c8eb1f14d4ebcfcd91e221406e7dafefe27fc11a"} Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.414092 4895 scope.go:117] "RemoveContainer" containerID="a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.443477 4895 scope.go:117] "RemoveContainer" containerID="1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.447771 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.457372 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7cp84"] Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.479915 4895 scope.go:117] "RemoveContainer" containerID="98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.543717 4895 scope.go:117] "RemoveContainer" containerID="a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39" Mar 20 14:08:55 crc kubenswrapper[4895]: E0320 14:08:55.545275 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39\": container with ID starting with a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39 not found: ID does not exist" containerID="a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.545323 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39"} err="failed to get container status \"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39\": rpc error: code = NotFound desc = could not find container \"a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39\": container with ID starting with a7408ef7595caddafbfbe4973df1d5fb7d945b19f412da2e1312c6d8c0696d39 not found: ID does not exist" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.545355 4895 scope.go:117] "RemoveContainer" containerID="1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58" Mar 20 14:08:55 crc kubenswrapper[4895]: E0320 14:08:55.545852 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58\": container with ID starting with 1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58 not found: ID does not exist" containerID="1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.545900 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58"} err="failed to get container status \"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58\": rpc error: code = NotFound desc = could not find container \"1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58\": container with ID starting with 1beff83798ad1f3f2e494c5ebba696a63642682c3be92db9a753a231ca1aad58 not found: ID does not exist" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.545934 4895 scope.go:117] "RemoveContainer" containerID="98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285" Mar 20 14:08:55 crc kubenswrapper[4895]: E0320 
14:08:55.546327 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285\": container with ID starting with 98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285 not found: ID does not exist" containerID="98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285" Mar 20 14:08:55 crc kubenswrapper[4895]: I0320 14:08:55.546383 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285"} err="failed to get container status \"98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285\": rpc error: code = NotFound desc = could not find container \"98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285\": container with ID starting with 98f7e774ee81d58ceef2de55f8e730c5295691c7752ac181f75d4f35ceb26285 not found: ID does not exist" Mar 20 14:08:57 crc kubenswrapper[4895]: I0320 14:08:57.225630 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" path="/var/lib/kubelet/pods/8f7f46bf-5d99-4185-925d-2909e944820c/volumes" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.152209 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566930-prcw4"] Mar 20 14:10:00 crc kubenswrapper[4895]: E0320 14:10:00.155709 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.155842 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4895]: E0320 14:10:00.155906 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.155959 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4895]: E0320 14:10:00.156076 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.156133 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.156846 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7f46bf-5d99-4185-925d-2909e944820c" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.158147 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.160729 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.161824 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.162159 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.171211 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-prcw4"] Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.359228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwrw\" (UniqueName: 
\"kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw\") pod \"auto-csr-approver-29566930-prcw4\" (UID: \"9ff5611f-af0e-414c-87e2-0371885d6e96\") " pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.466663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwrw\" (UniqueName: \"kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw\") pod \"auto-csr-approver-29566930-prcw4\" (UID: \"9ff5611f-af0e-414c-87e2-0371885d6e96\") " pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.487996 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwrw\" (UniqueName: \"kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw\") pod \"auto-csr-approver-29566930-prcw4\" (UID: \"9ff5611f-af0e-414c-87e2-0371885d6e96\") " pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:00 crc kubenswrapper[4895]: I0320 14:10:00.783623 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:01 crc kubenswrapper[4895]: I0320 14:10:01.265547 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-prcw4"] Mar 20 14:10:01 crc kubenswrapper[4895]: I0320 14:10:01.277340 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:10:02 crc kubenswrapper[4895]: I0320 14:10:02.069848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-prcw4" event={"ID":"9ff5611f-af0e-414c-87e2-0371885d6e96","Type":"ContainerStarted","Data":"c6e1bddb4fda0263dd332f1bc4e4d9456a57213de74216a5e80d94f7d82b098c"} Mar 20 14:10:04 crc kubenswrapper[4895]: I0320 14:10:04.087122 4895 generic.go:334] "Generic (PLEG): container finished" podID="9ff5611f-af0e-414c-87e2-0371885d6e96" containerID="4ac35a73d03b76637764d932718c8b591ed6d0c9c99346cdd1d48fa6420dbc58" exitCode=0 Mar 20 14:10:04 crc kubenswrapper[4895]: I0320 14:10:04.087228 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-prcw4" event={"ID":"9ff5611f-af0e-414c-87e2-0371885d6e96","Type":"ContainerDied","Data":"4ac35a73d03b76637764d932718c8b591ed6d0c9c99346cdd1d48fa6420dbc58"} Mar 20 14:10:05 crc kubenswrapper[4895]: I0320 14:10:05.513161 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:05 crc kubenswrapper[4895]: I0320 14:10:05.579586 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlwrw\" (UniqueName: \"kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw\") pod \"9ff5611f-af0e-414c-87e2-0371885d6e96\" (UID: \"9ff5611f-af0e-414c-87e2-0371885d6e96\") " Mar 20 14:10:05 crc kubenswrapper[4895]: I0320 14:10:05.587537 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw" (OuterVolumeSpecName: "kube-api-access-rlwrw") pod "9ff5611f-af0e-414c-87e2-0371885d6e96" (UID: "9ff5611f-af0e-414c-87e2-0371885d6e96"). InnerVolumeSpecName "kube-api-access-rlwrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:10:05 crc kubenswrapper[4895]: I0320 14:10:05.682132 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlwrw\" (UniqueName: \"kubernetes.io/projected/9ff5611f-af0e-414c-87e2-0371885d6e96-kube-api-access-rlwrw\") on node \"crc\" DevicePath \"\"" Mar 20 14:10:06 crc kubenswrapper[4895]: I0320 14:10:06.108574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-prcw4" event={"ID":"9ff5611f-af0e-414c-87e2-0371885d6e96","Type":"ContainerDied","Data":"c6e1bddb4fda0263dd332f1bc4e4d9456a57213de74216a5e80d94f7d82b098c"} Mar 20 14:10:06 crc kubenswrapper[4895]: I0320 14:10:06.108891 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6e1bddb4fda0263dd332f1bc4e4d9456a57213de74216a5e80d94f7d82b098c" Mar 20 14:10:06 crc kubenswrapper[4895]: I0320 14:10:06.108674 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-prcw4" Mar 20 14:10:06 crc kubenswrapper[4895]: I0320 14:10:06.583556 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-vhjzn"] Mar 20 14:10:06 crc kubenswrapper[4895]: I0320 14:10:06.593112 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-vhjzn"] Mar 20 14:10:07 crc kubenswrapper[4895]: I0320 14:10:07.223776 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a3e8f0-a68d-4612-987f-4624c41bf952" path="/var/lib/kubelet/pods/46a3e8f0-a68d-4612-987f-4624c41bf952/volumes" Mar 20 14:10:20 crc kubenswrapper[4895]: I0320 14:10:20.812148 4895 scope.go:117] "RemoveContainer" containerID="2decac017c3f4fe1a1099f20d620cd605c539df53f67adddd3f031a42ff252fe" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.405355 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:29 crc kubenswrapper[4895]: E0320 14:10:29.406561 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff5611f-af0e-414c-87e2-0371885d6e96" containerName="oc" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.406575 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff5611f-af0e-414c-87e2-0371885d6e96" containerName="oc" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.406826 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff5611f-af0e-414c-87e2-0371885d6e96" containerName="oc" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.409079 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.421569 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.478462 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64nqd\" (UniqueName: \"kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.478641 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.478726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.581769 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64nqd\" (UniqueName: \"kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.581894 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.581926 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.582513 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.582579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.607563 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64nqd\" (UniqueName: \"kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd\") pod \"redhat-marketplace-cggz7\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:29 crc kubenswrapper[4895]: I0320 14:10:29.730503 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:30 crc kubenswrapper[4895]: I0320 14:10:30.222341 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:30 crc kubenswrapper[4895]: I0320 14:10:30.359576 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerStarted","Data":"cfe10e003320a191147311b488bc9d26fcdb87cdf83ab316b74a47e0dd2e4b62"} Mar 20 14:10:31 crc kubenswrapper[4895]: I0320 14:10:31.369936 4895 generic.go:334] "Generic (PLEG): container finished" podID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerID="3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a" exitCode=0 Mar 20 14:10:31 crc kubenswrapper[4895]: I0320 14:10:31.370001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerDied","Data":"3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a"} Mar 20 14:10:33 crc kubenswrapper[4895]: I0320 14:10:33.395699 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerStarted","Data":"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209"} Mar 20 14:10:36 crc kubenswrapper[4895]: I0320 14:10:36.429959 4895 generic.go:334] "Generic (PLEG): container finished" podID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerID="2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209" exitCode=0 Mar 20 14:10:36 crc kubenswrapper[4895]: I0320 14:10:36.430035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" 
event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerDied","Data":"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209"} Mar 20 14:10:38 crc kubenswrapper[4895]: I0320 14:10:38.448114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerStarted","Data":"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a"} Mar 20 14:10:38 crc kubenswrapper[4895]: I0320 14:10:38.497811 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cggz7" podStartSLOduration=3.600241228 podStartE2EDuration="9.497771141s" podCreationTimestamp="2026-03-20 14:10:29 +0000 UTC" firstStartedPulling="2026-03-20 14:10:31.373973439 +0000 UTC m=+2930.883692405" lastFinishedPulling="2026-03-20 14:10:37.271503352 +0000 UTC m=+2936.781222318" observedRunningTime="2026-03-20 14:10:38.467975127 +0000 UTC m=+2937.977694093" watchObservedRunningTime="2026-03-20 14:10:38.497771141 +0000 UTC m=+2938.007490107" Mar 20 14:10:39 crc kubenswrapper[4895]: I0320 14:10:39.731667 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:39 crc kubenswrapper[4895]: I0320 14:10:39.731755 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:39 crc kubenswrapper[4895]: I0320 14:10:39.783265 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:49 crc kubenswrapper[4895]: I0320 14:10:49.779150 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:49 crc kubenswrapper[4895]: I0320 14:10:49.831787 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:50 crc kubenswrapper[4895]: I0320 14:10:50.569334 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cggz7" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="registry-server" containerID="cri-o://7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a" gracePeriod=2 Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.330684 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.480345 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content\") pod \"424321fb-bc9c-4a65-b091-d70e9b8c5947\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.480480 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities\") pod \"424321fb-bc9c-4a65-b091-d70e9b8c5947\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.480568 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64nqd\" (UniqueName: \"kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd\") pod \"424321fb-bc9c-4a65-b091-d70e9b8c5947\" (UID: \"424321fb-bc9c-4a65-b091-d70e9b8c5947\") " Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.482204 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities" (OuterVolumeSpecName: "utilities") pod "424321fb-bc9c-4a65-b091-d70e9b8c5947" (UID: 
"424321fb-bc9c-4a65-b091-d70e9b8c5947"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.489416 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd" (OuterVolumeSpecName: "kube-api-access-64nqd") pod "424321fb-bc9c-4a65-b091-d70e9b8c5947" (UID: "424321fb-bc9c-4a65-b091-d70e9b8c5947"). InnerVolumeSpecName "kube-api-access-64nqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.514206 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "424321fb-bc9c-4a65-b091-d70e9b8c5947" (UID: "424321fb-bc9c-4a65-b091-d70e9b8c5947"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.583538 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.583784 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64nqd\" (UniqueName: \"kubernetes.io/projected/424321fb-bc9c-4a65-b091-d70e9b8c5947-kube-api-access-64nqd\") on node \"crc\" DevicePath \"\"" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.583802 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424321fb-bc9c-4a65-b091-d70e9b8c5947-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.584311 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerID="7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a" exitCode=0 Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.584442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerDied","Data":"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a"} Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.584520 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cggz7" event={"ID":"424321fb-bc9c-4a65-b091-d70e9b8c5947","Type":"ContainerDied","Data":"cfe10e003320a191147311b488bc9d26fcdb87cdf83ab316b74a47e0dd2e4b62"} Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.584550 4895 scope.go:117] "RemoveContainer" containerID="7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.584567 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cggz7" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.636679 4895 scope.go:117] "RemoveContainer" containerID="2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.642299 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.657955 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cggz7"] Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.804641 4895 scope.go:117] "RemoveContainer" containerID="3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.829904 4895 scope.go:117] "RemoveContainer" containerID="7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a" Mar 20 14:10:51 crc kubenswrapper[4895]: E0320 14:10:51.830486 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a\": container with ID starting with 7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a not found: ID does not exist" containerID="7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.830643 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a"} err="failed to get container status \"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a\": rpc error: code = NotFound desc = could not find container \"7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a\": container with ID starting with 7ce9464151fb4dbc60d01d29c2aeb0d489791f903df81df5db4186067e023c5a not found: 
ID does not exist" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.830699 4895 scope.go:117] "RemoveContainer" containerID="2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209" Mar 20 14:10:51 crc kubenswrapper[4895]: E0320 14:10:51.831211 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209\": container with ID starting with 2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209 not found: ID does not exist" containerID="2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.831251 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209"} err="failed to get container status \"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209\": rpc error: code = NotFound desc = could not find container \"2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209\": container with ID starting with 2eb43e0205523bee8a7782faa79bed87e2ddd77a7699a8d66ec2243eec5d7209 not found: ID does not exist" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.831281 4895 scope.go:117] "RemoveContainer" containerID="3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a" Mar 20 14:10:51 crc kubenswrapper[4895]: E0320 14:10:51.831587 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a\": container with ID starting with 3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a not found: ID does not exist" containerID="3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a" Mar 20 14:10:51 crc kubenswrapper[4895]: I0320 14:10:51.831608 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a"} err="failed to get container status \"3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a\": rpc error: code = NotFound desc = could not find container \"3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a\": container with ID starting with 3248e21fcc3c974c6f17396f19d59e43b9f64b36e926f374a033c9ad61a1885a not found: ID does not exist" Mar 20 14:10:52 crc kubenswrapper[4895]: I0320 14:10:52.297305 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:10:52 crc kubenswrapper[4895]: I0320 14:10:52.297422 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:10:53 crc kubenswrapper[4895]: I0320 14:10:53.240709 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" path="/var/lib/kubelet/pods/424321fb-bc9c-4a65-b091-d70e9b8c5947/volumes" Mar 20 14:11:22 crc kubenswrapper[4895]: I0320 14:11:22.297927 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:11:22 crc kubenswrapper[4895]: I0320 14:11:22.298440 4895 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.297366 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.297968 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.298050 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.298958 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.299014 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" 
containerID="cri-o://8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" gracePeriod=600 Mar 20 14:11:52 crc kubenswrapper[4895]: E0320 14:11:52.421835 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.594493 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" exitCode=0 Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.594686 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633"} Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.595146 4895 scope.go:117] "RemoveContainer" containerID="b097e398cc43ae4c522523ec661c19c40195dca81adea8eec4f0ed2b2bc79b55" Mar 20 14:11:52 crc kubenswrapper[4895]: I0320 14:11:52.595991 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:11:52 crc kubenswrapper[4895]: E0320 14:11:52.596325 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" 
podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.139005 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xzzs8"] Mar 20 14:12:00 crc kubenswrapper[4895]: E0320 14:12:00.139903 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="extract-utilities" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.139916 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="extract-utilities" Mar 20 14:12:00 crc kubenswrapper[4895]: E0320 14:12:00.139928 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="extract-content" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.139933 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="extract-content" Mar 20 14:12:00 crc kubenswrapper[4895]: E0320 14:12:00.139951 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.139958 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.140179 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="424321fb-bc9c-4a65-b091-d70e9b8c5947" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.141045 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.143185 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.143260 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.143193 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.148867 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xzzs8"] Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.290124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzzr\" (UniqueName: \"kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr\") pod \"auto-csr-approver-29566932-xzzs8\" (UID: \"5616ecdf-c26d-4560-81ff-dcaa5dc5f395\") " pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.392042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzzr\" (UniqueName: \"kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr\") pod \"auto-csr-approver-29566932-xzzs8\" (UID: \"5616ecdf-c26d-4560-81ff-dcaa5dc5f395\") " pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.413108 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzzr\" (UniqueName: \"kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr\") pod \"auto-csr-approver-29566932-xzzs8\" (UID: \"5616ecdf-c26d-4560-81ff-dcaa5dc5f395\") " 
pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.460449 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:00 crc kubenswrapper[4895]: I0320 14:12:00.911868 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xzzs8"] Mar 20 14:12:01 crc kubenswrapper[4895]: I0320 14:12:01.678712 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" event={"ID":"5616ecdf-c26d-4560-81ff-dcaa5dc5f395","Type":"ContainerStarted","Data":"0bc8f2e2b287888d9ef1e65e811a862318b2448aa3b01dd734c8bde828fb2684"} Mar 20 14:12:03 crc kubenswrapper[4895]: I0320 14:12:03.702645 4895 generic.go:334] "Generic (PLEG): container finished" podID="5616ecdf-c26d-4560-81ff-dcaa5dc5f395" containerID="ac813b9990ca5a8cfd50eab2e9d44d04b4a5dbe47585781be8f826f56141e58c" exitCode=0 Mar 20 14:12:03 crc kubenswrapper[4895]: I0320 14:12:03.702722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" event={"ID":"5616ecdf-c26d-4560-81ff-dcaa5dc5f395","Type":"ContainerDied","Data":"ac813b9990ca5a8cfd50eab2e9d44d04b4a5dbe47585781be8f826f56141e58c"} Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.117508 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.211564 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzzr\" (UniqueName: \"kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr\") pod \"5616ecdf-c26d-4560-81ff-dcaa5dc5f395\" (UID: \"5616ecdf-c26d-4560-81ff-dcaa5dc5f395\") " Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.212228 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:12:05 crc kubenswrapper[4895]: E0320 14:12:05.212537 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.217593 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr" (OuterVolumeSpecName: "kube-api-access-6fzzr") pod "5616ecdf-c26d-4560-81ff-dcaa5dc5f395" (UID: "5616ecdf-c26d-4560-81ff-dcaa5dc5f395"). InnerVolumeSpecName "kube-api-access-6fzzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.314910 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzzr\" (UniqueName: \"kubernetes.io/projected/5616ecdf-c26d-4560-81ff-dcaa5dc5f395-kube-api-access-6fzzr\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.722295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" event={"ID":"5616ecdf-c26d-4560-81ff-dcaa5dc5f395","Type":"ContainerDied","Data":"0bc8f2e2b287888d9ef1e65e811a862318b2448aa3b01dd734c8bde828fb2684"} Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.722336 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc8f2e2b287888d9ef1e65e811a862318b2448aa3b01dd734c8bde828fb2684" Mar 20 14:12:05 crc kubenswrapper[4895]: I0320 14:12:05.722366 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xzzs8" Mar 20 14:12:06 crc kubenswrapper[4895]: I0320 14:12:06.187208 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-xdvdf"] Mar 20 14:12:06 crc kubenswrapper[4895]: I0320 14:12:06.196975 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-xdvdf"] Mar 20 14:12:07 crc kubenswrapper[4895]: I0320 14:12:07.222369 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76525fa0-1750-46f8-8d3b-04f8feec2f16" path="/var/lib/kubelet/pods/76525fa0-1750-46f8-8d3b-04f8feec2f16/volumes" Mar 20 14:12:18 crc kubenswrapper[4895]: I0320 14:12:18.212382 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:12:18 crc kubenswrapper[4895]: E0320 14:12:18.213273 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:12:21 crc kubenswrapper[4895]: I0320 14:12:21.348768 4895 scope.go:117] "RemoveContainer" containerID="b5add1003fac6cb2d034ec31b7a93f851b4e34c1eaea8d3749af2fae04a0153e" Mar 20 14:12:32 crc kubenswrapper[4895]: I0320 14:12:32.211424 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:12:32 crc kubenswrapper[4895]: E0320 14:12:32.212273 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:12:43 crc kubenswrapper[4895]: I0320 14:12:43.212174 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:12:43 crc kubenswrapper[4895]: E0320 14:12:43.213113 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:12:56 crc kubenswrapper[4895]: I0320 14:12:56.212050 4895 scope.go:117] "RemoveContainer" 
containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:12:56 crc kubenswrapper[4895]: E0320 14:12:56.212775 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:13:09 crc kubenswrapper[4895]: I0320 14:13:09.212436 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:13:09 crc kubenswrapper[4895]: E0320 14:13:09.213183 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:13:20 crc kubenswrapper[4895]: I0320 14:13:20.212233 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:13:20 crc kubenswrapper[4895]: E0320 14:13:20.213033 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:13:33 crc kubenswrapper[4895]: I0320 14:13:33.211726 4895 scope.go:117] 
"RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:13:33 crc kubenswrapper[4895]: E0320 14:13:33.212703 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:13:44 crc kubenswrapper[4895]: I0320 14:13:44.212381 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:13:44 crc kubenswrapper[4895]: E0320 14:13:44.213317 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:13:55 crc kubenswrapper[4895]: I0320 14:13:55.212413 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:13:55 crc kubenswrapper[4895]: E0320 14:13:55.213210 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.176153 
4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566934-jpprb"] Mar 20 14:14:00 crc kubenswrapper[4895]: E0320 14:14:00.177200 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5616ecdf-c26d-4560-81ff-dcaa5dc5f395" containerName="oc" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.177216 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5616ecdf-c26d-4560-81ff-dcaa5dc5f395" containerName="oc" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.177500 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5616ecdf-c26d-4560-81ff-dcaa5dc5f395" containerName="oc" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.178459 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.180934 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.181118 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.181286 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.186464 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-jpprb"] Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.286025 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kjbx\" (UniqueName: \"kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx\") pod \"auto-csr-approver-29566934-jpprb\" (UID: \"0dbb5ed5-242f-4d8c-afe6-124d33991e31\") " 
pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.388192 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kjbx\" (UniqueName: \"kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx\") pod \"auto-csr-approver-29566934-jpprb\" (UID: \"0dbb5ed5-242f-4d8c-afe6-124d33991e31\") " pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.416038 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kjbx\" (UniqueName: \"kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx\") pod \"auto-csr-approver-29566934-jpprb\" (UID: \"0dbb5ed5-242f-4d8c-afe6-124d33991e31\") " pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.498831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:00 crc kubenswrapper[4895]: I0320 14:14:00.952915 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-jpprb"] Mar 20 14:14:01 crc kubenswrapper[4895]: I0320 14:14:01.816358 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-jpprb" event={"ID":"0dbb5ed5-242f-4d8c-afe6-124d33991e31","Type":"ContainerStarted","Data":"792afcf0f8d26ea8b681056ffa7964fe9ac1bcc1bc3c9d5643c38a19a47c6291"} Mar 20 14:14:02 crc kubenswrapper[4895]: I0320 14:14:02.825953 4895 generic.go:334] "Generic (PLEG): container finished" podID="0dbb5ed5-242f-4d8c-afe6-124d33991e31" containerID="490d8d9c4be147598efeeac037e1926dff50769bfb47405f7a94380608a5e03b" exitCode=0 Mar 20 14:14:02 crc kubenswrapper[4895]: I0320 14:14:02.826000 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566934-jpprb" event={"ID":"0dbb5ed5-242f-4d8c-afe6-124d33991e31","Type":"ContainerDied","Data":"490d8d9c4be147598efeeac037e1926dff50769bfb47405f7a94380608a5e03b"} Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.236279 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.370507 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kjbx\" (UniqueName: \"kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx\") pod \"0dbb5ed5-242f-4d8c-afe6-124d33991e31\" (UID: \"0dbb5ed5-242f-4d8c-afe6-124d33991e31\") " Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.384758 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx" (OuterVolumeSpecName: "kube-api-access-2kjbx") pod "0dbb5ed5-242f-4d8c-afe6-124d33991e31" (UID: "0dbb5ed5-242f-4d8c-afe6-124d33991e31"). InnerVolumeSpecName "kube-api-access-2kjbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.474521 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kjbx\" (UniqueName: \"kubernetes.io/projected/0dbb5ed5-242f-4d8c-afe6-124d33991e31-kube-api-access-2kjbx\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.844014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-jpprb" event={"ID":"0dbb5ed5-242f-4d8c-afe6-124d33991e31","Type":"ContainerDied","Data":"792afcf0f8d26ea8b681056ffa7964fe9ac1bcc1bc3c9d5643c38a19a47c6291"} Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.844060 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="792afcf0f8d26ea8b681056ffa7964fe9ac1bcc1bc3c9d5643c38a19a47c6291" Mar 20 14:14:04 crc kubenswrapper[4895]: I0320 14:14:04.844219 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-jpprb" Mar 20 14:14:05 crc kubenswrapper[4895]: I0320 14:14:05.320633 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-bcvc2"] Mar 20 14:14:05 crc kubenswrapper[4895]: I0320 14:14:05.330514 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-bcvc2"] Mar 20 14:14:07 crc kubenswrapper[4895]: I0320 14:14:07.211919 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:14:07 crc kubenswrapper[4895]: E0320 14:14:07.212548 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:14:07 crc kubenswrapper[4895]: I0320 14:14:07.224322 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5" path="/var/lib/kubelet/pods/7b4e72e7-4efc-4ef4-b52f-a4bbfc3842b5/volumes" Mar 20 14:14:21 crc kubenswrapper[4895]: I0320 14:14:21.727253 4895 scope.go:117] "RemoveContainer" containerID="db3aee303cc8cbc64ac7ab6bfa12e72c70251487cd335be85b0f88a2e70687eb" Mar 20 14:14:22 crc kubenswrapper[4895]: I0320 14:14:22.212827 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:14:22 crc kubenswrapper[4895]: E0320 14:14:22.213432 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:14:34 crc kubenswrapper[4895]: I0320 14:14:34.212937 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:14:34 crc kubenswrapper[4895]: E0320 14:14:34.213947 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:14:45 crc kubenswrapper[4895]: I0320 14:14:45.212967 4895 scope.go:117] "RemoveContainer" 
containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:14:45 crc kubenswrapper[4895]: E0320 14:14:45.213924 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:14:58 crc kubenswrapper[4895]: I0320 14:14:58.211668 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:14:58 crc kubenswrapper[4895]: E0320 14:14:58.212535 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.145845 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp"] Mar 20 14:15:00 crc kubenswrapper[4895]: E0320 14:15:00.146274 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbb5ed5-242f-4d8c-afe6-124d33991e31" containerName="oc" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.146286 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbb5ed5-242f-4d8c-afe6-124d33991e31" containerName="oc" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.146505 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbb5ed5-242f-4d8c-afe6-124d33991e31" containerName="oc" Mar 
20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.147235 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.149553 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.151495 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.157170 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp"] Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.260087 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.260520 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fszgg\" (UniqueName: \"kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.260570 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume\") pod 
\"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.362346 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.362669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fszgg\" (UniqueName: \"kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.362785 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.363645 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.372084 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.380718 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fszgg\" (UniqueName: \"kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg\") pod \"collect-profiles-29566935-f6mtp\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:00 crc kubenswrapper[4895]: I0320 14:15:00.490442 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:01 crc kubenswrapper[4895]: I0320 14:15:01.109625 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp"] Mar 20 14:15:01 crc kubenswrapper[4895]: I0320 14:15:01.377989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" event={"ID":"fc8fa4f7-2f7c-455a-95d7-e565d73e27be","Type":"ContainerStarted","Data":"f6924a1c34efee8240ddd36fe0b56be31dc7eee4777e7393d04ce966bbfd7c4d"} Mar 20 14:15:01 crc kubenswrapper[4895]: I0320 14:15:01.378371 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" event={"ID":"fc8fa4f7-2f7c-455a-95d7-e565d73e27be","Type":"ContainerStarted","Data":"1ff2af61b91c3a7555cba7854e53ae13643343d6ad71495908b267ca394b7bcc"} Mar 20 14:15:02 crc kubenswrapper[4895]: I0320 14:15:02.388132 4895 generic.go:334] "Generic (PLEG): container finished" podID="fc8fa4f7-2f7c-455a-95d7-e565d73e27be" 
containerID="f6924a1c34efee8240ddd36fe0b56be31dc7eee4777e7393d04ce966bbfd7c4d" exitCode=0 Mar 20 14:15:02 crc kubenswrapper[4895]: I0320 14:15:02.388184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" event={"ID":"fc8fa4f7-2f7c-455a-95d7-e565d73e27be","Type":"ContainerDied","Data":"f6924a1c34efee8240ddd36fe0b56be31dc7eee4777e7393d04ce966bbfd7c4d"} Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.786860 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.939737 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fszgg\" (UniqueName: \"kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg\") pod \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.939848 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume\") pod \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.940129 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume\") pod \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\" (UID: \"fc8fa4f7-2f7c-455a-95d7-e565d73e27be\") " Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.940568 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"fc8fa4f7-2f7c-455a-95d7-e565d73e27be" (UID: "fc8fa4f7-2f7c-455a-95d7-e565d73e27be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.945921 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc8fa4f7-2f7c-455a-95d7-e565d73e27be" (UID: "fc8fa4f7-2f7c-455a-95d7-e565d73e27be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4895]: I0320 14:15:03.946512 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg" (OuterVolumeSpecName: "kube-api-access-fszgg") pod "fc8fa4f7-2f7c-455a-95d7-e565d73e27be" (UID: "fc8fa4f7-2f7c-455a-95d7-e565d73e27be"). InnerVolumeSpecName "kube-api-access-fszgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.043048 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.043161 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fszgg\" (UniqueName: \"kubernetes.io/projected/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-kube-api-access-fszgg\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.043177 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc8fa4f7-2f7c-455a-95d7-e565d73e27be-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.312200 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl"] Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.323464 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nkfwl"] Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.406592 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" event={"ID":"fc8fa4f7-2f7c-455a-95d7-e565d73e27be","Type":"ContainerDied","Data":"1ff2af61b91c3a7555cba7854e53ae13643343d6ad71495908b267ca394b7bcc"} Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.406629 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff2af61b91c3a7555cba7854e53ae13643343d6ad71495908b267ca394b7bcc" Mar 20 14:15:04 crc kubenswrapper[4895]: I0320 14:15:04.406687 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-f6mtp" Mar 20 14:15:05 crc kubenswrapper[4895]: I0320 14:15:05.223115 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc54edb-e159-48f3-8c25-cc714a7ab3a5" path="/var/lib/kubelet/pods/8fc54edb-e159-48f3-8c25-cc714a7ab3a5/volumes" Mar 20 14:15:09 crc kubenswrapper[4895]: I0320 14:15:09.212170 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:15:09 crc kubenswrapper[4895]: E0320 14:15:09.212450 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:15:20 crc kubenswrapper[4895]: I0320 14:15:20.211777 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:15:20 crc kubenswrapper[4895]: E0320 14:15:20.212565 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:15:21 crc kubenswrapper[4895]: I0320 14:15:21.810994 4895 scope.go:117] "RemoveContainer" containerID="149b13089bd0115b836c1af3981aea33aaa7ac022ee7bdde63733d22ca27509f" Mar 20 14:15:33 crc kubenswrapper[4895]: I0320 14:15:33.211986 4895 scope.go:117] "RemoveContainer" 
containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:15:33 crc kubenswrapper[4895]: E0320 14:15:33.212930 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:15:48 crc kubenswrapper[4895]: I0320 14:15:48.211998 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:15:48 crc kubenswrapper[4895]: E0320 14:15:48.212734 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.157054 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566936-ch8pz"] Mar 20 14:16:00 crc kubenswrapper[4895]: E0320 14:16:00.158015 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8fa4f7-2f7c-455a-95d7-e565d73e27be" containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.158028 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8fa4f7-2f7c-455a-95d7-e565d73e27be" containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.158251 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8fa4f7-2f7c-455a-95d7-e565d73e27be" 
containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.159088 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.161546 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.161670 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.161745 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.173066 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-ch8pz"] Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.220226 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr8jt\" (UniqueName: \"kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt\") pod \"auto-csr-approver-29566936-ch8pz\" (UID: \"5fc568a3-92a8-40ab-9fe2-da6100500652\") " pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.322348 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr8jt\" (UniqueName: \"kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt\") pod \"auto-csr-approver-29566936-ch8pz\" (UID: \"5fc568a3-92a8-40ab-9fe2-da6100500652\") " pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.347591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr8jt\" (UniqueName: 
\"kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt\") pod \"auto-csr-approver-29566936-ch8pz\" (UID: \"5fc568a3-92a8-40ab-9fe2-da6100500652\") " pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.484228 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.931971 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-ch8pz"] Mar 20 14:16:00 crc kubenswrapper[4895]: I0320 14:16:00.937451 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:16:01 crc kubenswrapper[4895]: I0320 14:16:01.012027 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" event={"ID":"5fc568a3-92a8-40ab-9fe2-da6100500652","Type":"ContainerStarted","Data":"7df5abfe2cd35824bff0623cc2cc42b70e73bb09b1b88760fd313193cda5ed19"} Mar 20 14:16:01 crc kubenswrapper[4895]: I0320 14:16:01.219993 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:16:01 crc kubenswrapper[4895]: E0320 14:16:01.220278 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:03 crc kubenswrapper[4895]: I0320 14:16:03.031864 4895 generic.go:334] "Generic (PLEG): container finished" podID="5fc568a3-92a8-40ab-9fe2-da6100500652" 
containerID="c901bebdeb73549a39593e48b120e2b2d1d4f10a3e013001edbf7e88cc27a93e" exitCode=0 Mar 20 14:16:03 crc kubenswrapper[4895]: I0320 14:16:03.031959 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" event={"ID":"5fc568a3-92a8-40ab-9fe2-da6100500652","Type":"ContainerDied","Data":"c901bebdeb73549a39593e48b120e2b2d1d4f10a3e013001edbf7e88cc27a93e"} Mar 20 14:16:04 crc kubenswrapper[4895]: I0320 14:16:04.472059 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:04 crc kubenswrapper[4895]: I0320 14:16:04.505490 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr8jt\" (UniqueName: \"kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt\") pod \"5fc568a3-92a8-40ab-9fe2-da6100500652\" (UID: \"5fc568a3-92a8-40ab-9fe2-da6100500652\") " Mar 20 14:16:04 crc kubenswrapper[4895]: I0320 14:16:04.514717 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt" (OuterVolumeSpecName: "kube-api-access-jr8jt") pod "5fc568a3-92a8-40ab-9fe2-da6100500652" (UID: "5fc568a3-92a8-40ab-9fe2-da6100500652"). InnerVolumeSpecName "kube-api-access-jr8jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:16:04 crc kubenswrapper[4895]: I0320 14:16:04.606939 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr8jt\" (UniqueName: \"kubernetes.io/projected/5fc568a3-92a8-40ab-9fe2-da6100500652-kube-api-access-jr8jt\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:05 crc kubenswrapper[4895]: I0320 14:16:05.052610 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" event={"ID":"5fc568a3-92a8-40ab-9fe2-da6100500652","Type":"ContainerDied","Data":"7df5abfe2cd35824bff0623cc2cc42b70e73bb09b1b88760fd313193cda5ed19"} Mar 20 14:16:05 crc kubenswrapper[4895]: I0320 14:16:05.053118 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df5abfe2cd35824bff0623cc2cc42b70e73bb09b1b88760fd313193cda5ed19" Mar 20 14:16:05 crc kubenswrapper[4895]: I0320 14:16:05.052715 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-ch8pz" Mar 20 14:16:05 crc kubenswrapper[4895]: I0320 14:16:05.555979 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-prcw4"] Mar 20 14:16:05 crc kubenswrapper[4895]: I0320 14:16:05.572084 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-prcw4"] Mar 20 14:16:07 crc kubenswrapper[4895]: I0320 14:16:07.236618 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff5611f-af0e-414c-87e2-0371885d6e96" path="/var/lib/kubelet/pods/9ff5611f-af0e-414c-87e2-0371885d6e96/volumes" Mar 20 14:16:12 crc kubenswrapper[4895]: I0320 14:16:12.212081 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:16:12 crc kubenswrapper[4895]: E0320 14:16:12.212815 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:21 crc kubenswrapper[4895]: I0320 14:16:21.876100 4895 scope.go:117] "RemoveContainer" containerID="4ac35a73d03b76637764d932718c8b591ed6d0c9c99346cdd1d48fa6420dbc58" Mar 20 14:16:23 crc kubenswrapper[4895]: I0320 14:16:23.211576 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:16:23 crc kubenswrapper[4895]: E0320 14:16:23.212091 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:36 crc kubenswrapper[4895]: I0320 14:16:36.212615 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:16:36 crc kubenswrapper[4895]: E0320 14:16:36.213442 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.163156 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:46 crc kubenswrapper[4895]: E0320 14:16:46.164323 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc568a3-92a8-40ab-9fe2-da6100500652" containerName="oc" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.164341 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc568a3-92a8-40ab-9fe2-da6100500652" containerName="oc" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.164607 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc568a3-92a8-40ab-9fe2-da6100500652" containerName="oc" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.166135 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.176024 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.266245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.266420 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqhs\" (UniqueName: \"kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.266465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.368243 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqhs\" (UniqueName: \"kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.368345 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.368439 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.369861 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.370089 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.388839 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqhs\" (UniqueName: \"kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs\") pod \"certified-operators-2jvbh\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:46 crc kubenswrapper[4895]: I0320 14:16:46.500475 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:47 crc kubenswrapper[4895]: I0320 14:16:47.000862 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:47 crc kubenswrapper[4895]: I0320 14:16:47.437106 4895 generic.go:334] "Generic (PLEG): container finished" podID="c977e557-4f28-41a5-861c-05498110be31" containerID="b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2" exitCode=0 Mar 20 14:16:47 crc kubenswrapper[4895]: I0320 14:16:47.437213 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerDied","Data":"b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2"} Mar 20 14:16:47 crc kubenswrapper[4895]: I0320 14:16:47.437418 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerStarted","Data":"509c63ef651a6134b1ba769b81bb472c62e1799b24123f1fd1d6bd2480b7d8e2"} Mar 20 14:16:48 crc kubenswrapper[4895]: I0320 14:16:48.450134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerStarted","Data":"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511"} Mar 20 14:16:49 crc kubenswrapper[4895]: I0320 14:16:49.463069 4895 generic.go:334] "Generic (PLEG): container finished" podID="c977e557-4f28-41a5-861c-05498110be31" containerID="e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511" exitCode=0 Mar 20 14:16:49 crc kubenswrapper[4895]: I0320 14:16:49.463110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerDied","Data":"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511"} Mar 20 14:16:50 crc kubenswrapper[4895]: I0320 14:16:50.212239 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:16:50 crc kubenswrapper[4895]: E0320 14:16:50.212866 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:16:50 crc kubenswrapper[4895]: I0320 14:16:50.475223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerStarted","Data":"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c"} Mar 20 14:16:50 crc kubenswrapper[4895]: I0320 14:16:50.498132 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2jvbh" 
podStartSLOduration=1.917862461 podStartE2EDuration="4.498109831s" podCreationTimestamp="2026-03-20 14:16:46 +0000 UTC" firstStartedPulling="2026-03-20 14:16:47.439335944 +0000 UTC m=+3306.949054920" lastFinishedPulling="2026-03-20 14:16:50.019583324 +0000 UTC m=+3309.529302290" observedRunningTime="2026-03-20 14:16:50.490748555 +0000 UTC m=+3310.000467521" watchObservedRunningTime="2026-03-20 14:16:50.498109831 +0000 UTC m=+3310.007828797" Mar 20 14:16:56 crc kubenswrapper[4895]: I0320 14:16:56.501129 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:56 crc kubenswrapper[4895]: I0320 14:16:56.502729 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:56 crc kubenswrapper[4895]: I0320 14:16:56.565054 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:56 crc kubenswrapper[4895]: I0320 14:16:56.611343 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:56 crc kubenswrapper[4895]: I0320 14:16:56.800806 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:58 crc kubenswrapper[4895]: I0320 14:16:58.555792 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2jvbh" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="registry-server" containerID="cri-o://f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c" gracePeriod=2 Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.138935 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.232418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content\") pod \"c977e557-4f28-41a5-861c-05498110be31\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.232960 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdqhs\" (UniqueName: \"kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs\") pod \"c977e557-4f28-41a5-861c-05498110be31\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.232992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities\") pod \"c977e557-4f28-41a5-861c-05498110be31\" (UID: \"c977e557-4f28-41a5-861c-05498110be31\") " Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.234635 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities" (OuterVolumeSpecName: "utilities") pod "c977e557-4f28-41a5-861c-05498110be31" (UID: "c977e557-4f28-41a5-861c-05498110be31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.240600 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs" (OuterVolumeSpecName: "kube-api-access-wdqhs") pod "c977e557-4f28-41a5-861c-05498110be31" (UID: "c977e557-4f28-41a5-861c-05498110be31"). InnerVolumeSpecName "kube-api-access-wdqhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.335842 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdqhs\" (UniqueName: \"kubernetes.io/projected/c977e557-4f28-41a5-861c-05498110be31-kube-api-access-wdqhs\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.335873 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.496250 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c977e557-4f28-41a5-861c-05498110be31" (UID: "c977e557-4f28-41a5-861c-05498110be31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.539592 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c977e557-4f28-41a5-861c-05498110be31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.570923 4895 generic.go:334] "Generic (PLEG): container finished" podID="c977e557-4f28-41a5-861c-05498110be31" containerID="f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c" exitCode=0 Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.570966 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerDied","Data":"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c"} Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.570991 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2jvbh" event={"ID":"c977e557-4f28-41a5-861c-05498110be31","Type":"ContainerDied","Data":"509c63ef651a6134b1ba769b81bb472c62e1799b24123f1fd1d6bd2480b7d8e2"} Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.571012 4895 scope.go:117] "RemoveContainer" containerID="f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.571131 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2jvbh" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.609869 4895 scope.go:117] "RemoveContainer" containerID="e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.613119 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.625063 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2jvbh"] Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.635510 4895 scope.go:117] "RemoveContainer" containerID="b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.696134 4895 scope.go:117] "RemoveContainer" containerID="f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c" Mar 20 14:16:59 crc kubenswrapper[4895]: E0320 14:16:59.696597 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c\": container with ID starting with f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c not found: ID does not exist" containerID="f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 
14:16:59.696631 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c"} err="failed to get container status \"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c\": rpc error: code = NotFound desc = could not find container \"f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c\": container with ID starting with f2deee5a5c8e365e66f801eb44e3733b6ec2ebc94665f49120dbda744ce7431c not found: ID does not exist" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.696651 4895 scope.go:117] "RemoveContainer" containerID="e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511" Mar 20 14:16:59 crc kubenswrapper[4895]: E0320 14:16:59.697214 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511\": container with ID starting with e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511 not found: ID does not exist" containerID="e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.697240 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511"} err="failed to get container status \"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511\": rpc error: code = NotFound desc = could not find container \"e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511\": container with ID starting with e70ebe138d8e02548952f53a795b3b2137b0bb892a7ccd78d41b146fa65fd511 not found: ID does not exist" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.697254 4895 scope.go:117] "RemoveContainer" containerID="b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2" Mar 20 14:16:59 crc 
kubenswrapper[4895]: E0320 14:16:59.697586 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2\": container with ID starting with b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2 not found: ID does not exist" containerID="b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2" Mar 20 14:16:59 crc kubenswrapper[4895]: I0320 14:16:59.697620 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2"} err="failed to get container status \"b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2\": rpc error: code = NotFound desc = could not find container \"b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2\": container with ID starting with b895a78b681d8bb114de97c0579b4ee67fdf1e5eeafdb9d8cf5765d05f6299e2 not found: ID does not exist" Mar 20 14:17:01 crc kubenswrapper[4895]: I0320 14:17:01.246714 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c977e557-4f28-41a5-861c-05498110be31" path="/var/lib/kubelet/pods/c977e557-4f28-41a5-861c-05498110be31/volumes" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.644654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:04 crc kubenswrapper[4895]: E0320 14:17:04.645621 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="registry-server" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.645638 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="registry-server" Mar 20 14:17:04 crc kubenswrapper[4895]: E0320 14:17:04.645693 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="extract-utilities" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.645700 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="extract-utilities" Mar 20 14:17:04 crc kubenswrapper[4895]: E0320 14:17:04.645712 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="extract-content" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.645718 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="extract-content" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.645937 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c977e557-4f28-41a5-861c-05498110be31" containerName="registry-server" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.648275 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.666411 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.744210 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbwm\" (UniqueName: \"kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.744327 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities\") pod \"redhat-operators-fr4sx\" (UID: 
\"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.744352 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.846465 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbwm\" (UniqueName: \"kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.846617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.846640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.847132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content\") pod \"redhat-operators-fr4sx\" (UID: 
\"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.847204 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.865038 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbwm\" (UniqueName: \"kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm\") pod \"redhat-operators-fr4sx\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:04 crc kubenswrapper[4895]: I0320 14:17:04.970179 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:05 crc kubenswrapper[4895]: I0320 14:17:05.212123 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633" Mar 20 14:17:05 crc kubenswrapper[4895]: W0320 14:17:05.542587 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517266b7_0eff_4cf2_84f9_65451ff2816c.slice/crio-be4f522cba81e3d2beb375cc45cdc1787fff615bec67fc9c5e88a150d6f4115c WatchSource:0}: Error finding container be4f522cba81e3d2beb375cc45cdc1787fff615bec67fc9c5e88a150d6f4115c: Status 404 returned error can't find the container with id be4f522cba81e3d2beb375cc45cdc1787fff615bec67fc9c5e88a150d6f4115c Mar 20 14:17:05 crc kubenswrapper[4895]: I0320 14:17:05.559955 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:05 crc kubenswrapper[4895]: I0320 
14:17:05.631664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerStarted","Data":"be4f522cba81e3d2beb375cc45cdc1787fff615bec67fc9c5e88a150d6f4115c"} Mar 20 14:17:05 crc kubenswrapper[4895]: I0320 14:17:05.636992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351"} Mar 20 14:17:06 crc kubenswrapper[4895]: I0320 14:17:06.649527 4895 generic.go:334] "Generic (PLEG): container finished" podID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerID="3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f" exitCode=0 Mar 20 14:17:06 crc kubenswrapper[4895]: I0320 14:17:06.649680 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerDied","Data":"3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f"} Mar 20 14:17:07 crc kubenswrapper[4895]: I0320 14:17:07.660129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerStarted","Data":"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64"} Mar 20 14:17:12 crc kubenswrapper[4895]: I0320 14:17:12.708654 4895 generic.go:334] "Generic (PLEG): container finished" podID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerID="0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64" exitCode=0 Mar 20 14:17:12 crc kubenswrapper[4895]: I0320 14:17:12.708745 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" 
event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerDied","Data":"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64"} Mar 20 14:17:13 crc kubenswrapper[4895]: I0320 14:17:13.723992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerStarted","Data":"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32"} Mar 20 14:17:13 crc kubenswrapper[4895]: I0320 14:17:13.746022 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fr4sx" podStartSLOduration=3.27748455 podStartE2EDuration="9.746003915s" podCreationTimestamp="2026-03-20 14:17:04 +0000 UTC" firstStartedPulling="2026-03-20 14:17:06.659783698 +0000 UTC m=+3326.169502664" lastFinishedPulling="2026-03-20 14:17:13.128303063 +0000 UTC m=+3332.638022029" observedRunningTime="2026-03-20 14:17:13.743179677 +0000 UTC m=+3333.252898643" watchObservedRunningTime="2026-03-20 14:17:13.746003915 +0000 UTC m=+3333.255722881" Mar 20 14:17:14 crc kubenswrapper[4895]: I0320 14:17:14.971302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:14 crc kubenswrapper[4895]: I0320 14:17:14.971633 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:16 crc kubenswrapper[4895]: I0320 14:17:16.046996 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fr4sx" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" probeResult="failure" output=< Mar 20 14:17:16 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:17:16 crc kubenswrapper[4895]: > Mar 20 14:17:26 crc kubenswrapper[4895]: I0320 14:17:26.019834 4895 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-fr4sx" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" probeResult="failure" output=< Mar 20 14:17:26 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:17:26 crc kubenswrapper[4895]: > Mar 20 14:17:35 crc kubenswrapper[4895]: I0320 14:17:35.018298 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:35 crc kubenswrapper[4895]: I0320 14:17:35.077086 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:35 crc kubenswrapper[4895]: I0320 14:17:35.845657 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:36 crc kubenswrapper[4895]: I0320 14:17:36.923148 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fr4sx" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" containerID="cri-o://b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32" gracePeriod=2 Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.531252 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.668301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities\") pod \"517266b7-0eff-4cf2-84f9-65451ff2816c\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.668660 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbwm\" (UniqueName: \"kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm\") pod \"517266b7-0eff-4cf2-84f9-65451ff2816c\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.669045 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content\") pod \"517266b7-0eff-4cf2-84f9-65451ff2816c\" (UID: \"517266b7-0eff-4cf2-84f9-65451ff2816c\") " Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.670565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities" (OuterVolumeSpecName: "utilities") pod "517266b7-0eff-4cf2-84f9-65451ff2816c" (UID: "517266b7-0eff-4cf2-84f9-65451ff2816c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.674681 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm" (OuterVolumeSpecName: "kube-api-access-4wbwm") pod "517266b7-0eff-4cf2-84f9-65451ff2816c" (UID: "517266b7-0eff-4cf2-84f9-65451ff2816c"). InnerVolumeSpecName "kube-api-access-4wbwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.771263 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.771306 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbwm\" (UniqueName: \"kubernetes.io/projected/517266b7-0eff-4cf2-84f9-65451ff2816c-kube-api-access-4wbwm\") on node \"crc\" DevicePath \"\"" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.800246 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "517266b7-0eff-4cf2-84f9-65451ff2816c" (UID: "517266b7-0eff-4cf2-84f9-65451ff2816c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.872506 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517266b7-0eff-4cf2-84f9-65451ff2816c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.934991 4895 generic.go:334] "Generic (PLEG): container finished" podID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerID="b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32" exitCode=0 Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.935038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerDied","Data":"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32"} Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.935069 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fr4sx" event={"ID":"517266b7-0eff-4cf2-84f9-65451ff2816c","Type":"ContainerDied","Data":"be4f522cba81e3d2beb375cc45cdc1787fff615bec67fc9c5e88a150d6f4115c"} Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.935091 4895 scope.go:117] "RemoveContainer" containerID="b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.935140 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fr4sx" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.957638 4895 scope.go:117] "RemoveContainer" containerID="0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64" Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.966499 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.975805 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fr4sx"] Mar 20 14:17:37 crc kubenswrapper[4895]: I0320 14:17:37.996347 4895 scope.go:117] "RemoveContainer" containerID="3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.032888 4895 scope.go:117] "RemoveContainer" containerID="b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32" Mar 20 14:17:38 crc kubenswrapper[4895]: E0320 14:17:38.033370 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32\": container with ID starting with b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32 not found: ID does not exist" containerID="b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.033426 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32"} err="failed to get container status \"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32\": rpc error: code = NotFound desc = could not find container \"b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32\": container with ID starting with b14a057cf43c2f496a2684fac41fab27437ac74dce723731498aeed36238ce32 not found: ID does not exist" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.033451 4895 scope.go:117] "RemoveContainer" containerID="0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64" Mar 20 14:17:38 crc kubenswrapper[4895]: E0320 14:17:38.033894 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64\": container with ID starting with 0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64 not found: ID does not exist" containerID="0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.033942 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64"} err="failed to get container status \"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64\": rpc error: code = NotFound desc = could not find container \"0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64\": container with ID starting with 0562c3881f364cd2277a81f2e141c580872a3abd9589caaff265882123119f64 not found: ID does not exist" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.033976 4895 scope.go:117] "RemoveContainer" containerID="3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f" Mar 20 14:17:38 crc kubenswrapper[4895]: E0320 
14:17:38.034352 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f\": container with ID starting with 3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f not found: ID does not exist" containerID="3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f" Mar 20 14:17:38 crc kubenswrapper[4895]: I0320 14:17:38.034384 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f"} err="failed to get container status \"3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f\": rpc error: code = NotFound desc = could not find container \"3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f\": container with ID starting with 3e8dffb07c55f2bb94c54387f4a1dc31e72fbe6a1fc3868104cdf23d83c6022f not found: ID does not exist" Mar 20 14:17:39 crc kubenswrapper[4895]: I0320 14:17:39.228201 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" path="/var/lib/kubelet/pods/517266b7-0eff-4cf2-84f9-65451ff2816c/volumes" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.144163 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566938-gp7bk"] Mar 20 14:18:00 crc kubenswrapper[4895]: E0320 14:18:00.145121 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="extract-content" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.145135 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="extract-content" Mar 20 14:18:00 crc kubenswrapper[4895]: E0320 14:18:00.145169 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.145174 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" Mar 20 14:18:00 crc kubenswrapper[4895]: E0320 14:18:00.145190 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="extract-utilities" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.145196 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="extract-utilities" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.145377 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="517266b7-0eff-4cf2-84f9-65451ff2816c" containerName="registry-server" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.146072 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.148655 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.149414 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.149594 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.153945 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-gp7bk"] Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.336139 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvtc\" (UniqueName: 
\"kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc\") pod \"auto-csr-approver-29566938-gp7bk\" (UID: \"3094ddb0-fb66-43a1-8e8a-da51248c25ce\") " pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.438226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvtc\" (UniqueName: \"kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc\") pod \"auto-csr-approver-29566938-gp7bk\" (UID: \"3094ddb0-fb66-43a1-8e8a-da51248c25ce\") " pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.457197 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvtc\" (UniqueName: \"kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc\") pod \"auto-csr-approver-29566938-gp7bk\" (UID: \"3094ddb0-fb66-43a1-8e8a-da51248c25ce\") " pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.502133 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:00 crc kubenswrapper[4895]: I0320 14:18:00.961768 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-gp7bk"] Mar 20 14:18:01 crc kubenswrapper[4895]: I0320 14:18:01.170049 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" event={"ID":"3094ddb0-fb66-43a1-8e8a-da51248c25ce","Type":"ContainerStarted","Data":"1c648d2aa4ce15eb4e7f6ac7d520db6a19975328f15bdbe7c4f257b39676b17d"} Mar 20 14:18:03 crc kubenswrapper[4895]: I0320 14:18:03.198189 4895 generic.go:334] "Generic (PLEG): container finished" podID="3094ddb0-fb66-43a1-8e8a-da51248c25ce" containerID="19f580266f7f00e77ce0f46a8abaf9c8316752c400e7455f4bc4576e0aa06b5d" exitCode=0 Mar 20 14:18:03 crc kubenswrapper[4895]: I0320 14:18:03.198297 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" event={"ID":"3094ddb0-fb66-43a1-8e8a-da51248c25ce","Type":"ContainerDied","Data":"19f580266f7f00e77ce0f46a8abaf9c8316752c400e7455f4bc4576e0aa06b5d"} Mar 20 14:18:04 crc kubenswrapper[4895]: I0320 14:18:04.596230 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:04 crc kubenswrapper[4895]: I0320 14:18:04.725695 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdvtc\" (UniqueName: \"kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc\") pod \"3094ddb0-fb66-43a1-8e8a-da51248c25ce\" (UID: \"3094ddb0-fb66-43a1-8e8a-da51248c25ce\") " Mar 20 14:18:04 crc kubenswrapper[4895]: I0320 14:18:04.730245 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc" (OuterVolumeSpecName: "kube-api-access-vdvtc") pod "3094ddb0-fb66-43a1-8e8a-da51248c25ce" (UID: "3094ddb0-fb66-43a1-8e8a-da51248c25ce"). InnerVolumeSpecName "kube-api-access-vdvtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:18:04 crc kubenswrapper[4895]: I0320 14:18:04.828869 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdvtc\" (UniqueName: \"kubernetes.io/projected/3094ddb0-fb66-43a1-8e8a-da51248c25ce-kube-api-access-vdvtc\") on node \"crc\" DevicePath \"\"" Mar 20 14:18:05 crc kubenswrapper[4895]: I0320 14:18:05.220416 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" Mar 20 14:18:05 crc kubenswrapper[4895]: I0320 14:18:05.225308 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-gp7bk" event={"ID":"3094ddb0-fb66-43a1-8e8a-da51248c25ce","Type":"ContainerDied","Data":"1c648d2aa4ce15eb4e7f6ac7d520db6a19975328f15bdbe7c4f257b39676b17d"} Mar 20 14:18:05 crc kubenswrapper[4895]: I0320 14:18:05.225354 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c648d2aa4ce15eb4e7f6ac7d520db6a19975328f15bdbe7c4f257b39676b17d" Mar 20 14:18:05 crc kubenswrapper[4895]: I0320 14:18:05.664261 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xzzs8"] Mar 20 14:18:05 crc kubenswrapper[4895]: I0320 14:18:05.674596 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xzzs8"] Mar 20 14:18:07 crc kubenswrapper[4895]: I0320 14:18:07.224991 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5616ecdf-c26d-4560-81ff-dcaa5dc5f395" path="/var/lib/kubelet/pods/5616ecdf-c26d-4560-81ff-dcaa5dc5f395/volumes" Mar 20 14:18:21 crc kubenswrapper[4895]: I0320 14:18:21.992703 4895 scope.go:117] "RemoveContainer" containerID="ac813b9990ca5a8cfd50eab2e9d44d04b4a5dbe47585781be8f826f56141e58c" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.300954 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4p5lm"] Mar 20 14:18:55 crc kubenswrapper[4895]: E0320 14:18:55.302501 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3094ddb0-fb66-43a1-8e8a-da51248c25ce" containerName="oc" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.302522 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3094ddb0-fb66-43a1-8e8a-da51248c25ce" containerName="oc" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 
14:18:55.302966 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3094ddb0-fb66-43a1-8e8a-da51248c25ce" containerName="oc" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.307610 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.314461 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4p5lm"] Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.447860 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcg7\" (UniqueName: \"kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.448031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.448310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.552878 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.552950 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcg7\" (UniqueName: \"kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.552990 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.553464 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.553516 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.579833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcg7\" (UniqueName: 
\"kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7\") pod \"community-operators-4p5lm\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") " pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:55 crc kubenswrapper[4895]: I0320 14:18:55.634407 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:18:56 crc kubenswrapper[4895]: I0320 14:18:56.102743 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4p5lm"] Mar 20 14:18:56 crc kubenswrapper[4895]: I0320 14:18:56.742680 4895 generic.go:334] "Generic (PLEG): container finished" podID="bfeb694c-7cb8-4949-863d-bb135058a418" containerID="084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51" exitCode=0 Mar 20 14:18:56 crc kubenswrapper[4895]: I0320 14:18:56.742748 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerDied","Data":"084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51"} Mar 20 14:18:56 crc kubenswrapper[4895]: I0320 14:18:56.743070 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerStarted","Data":"198ef48fa0644275b18899c2043d13997bc97171a9f35c298a3d04268c014670"} Mar 20 14:18:58 crc kubenswrapper[4895]: I0320 14:18:58.765009 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerStarted","Data":"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"} Mar 20 14:18:59 crc kubenswrapper[4895]: I0320 14:18:59.775653 4895 generic.go:334] "Generic (PLEG): container finished" podID="bfeb694c-7cb8-4949-863d-bb135058a418" 
containerID="df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522" exitCode=0 Mar 20 14:18:59 crc kubenswrapper[4895]: I0320 14:18:59.775729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerDied","Data":"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"} Mar 20 14:19:01 crc kubenswrapper[4895]: I0320 14:19:01.804109 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerStarted","Data":"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"} Mar 20 14:19:01 crc kubenswrapper[4895]: I0320 14:19:01.828333 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4p5lm" podStartSLOduration=2.979664936 podStartE2EDuration="6.828311085s" podCreationTimestamp="2026-03-20 14:18:55 +0000 UTC" firstStartedPulling="2026-03-20 14:18:56.744701733 +0000 UTC m=+3436.254420709" lastFinishedPulling="2026-03-20 14:19:00.593347892 +0000 UTC m=+3440.103066858" observedRunningTime="2026-03-20 14:19:01.822963298 +0000 UTC m=+3441.332682264" watchObservedRunningTime="2026-03-20 14:19:01.828311085 +0000 UTC m=+3441.338030051" Mar 20 14:19:05 crc kubenswrapper[4895]: I0320 14:19:05.634923 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:19:05 crc kubenswrapper[4895]: I0320 14:19:05.635632 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:19:05 crc kubenswrapper[4895]: I0320 14:19:05.722911 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4p5lm" Mar 20 14:19:05 crc kubenswrapper[4895]: I0320 
14:19:05.895063 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4p5lm"
Mar 20 14:19:06 crc kubenswrapper[4895]: I0320 14:19:06.888184 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4p5lm"]
Mar 20 14:19:07 crc kubenswrapper[4895]: I0320 14:19:07.868770 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4p5lm" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="registry-server" containerID="cri-o://a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da" gracePeriod=2
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.353850 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4p5lm"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.532826 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwcg7\" (UniqueName: \"kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7\") pod \"bfeb694c-7cb8-4949-863d-bb135058a418\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") "
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.532925 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content\") pod \"bfeb694c-7cb8-4949-863d-bb135058a418\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") "
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.545858 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities\") pod \"bfeb694c-7cb8-4949-863d-bb135058a418\" (UID: \"bfeb694c-7cb8-4949-863d-bb135058a418\") "
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.548303 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities" (OuterVolumeSpecName: "utilities") pod "bfeb694c-7cb8-4949-863d-bb135058a418" (UID: "bfeb694c-7cb8-4949-863d-bb135058a418"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.582614 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7" (OuterVolumeSpecName: "kube-api-access-kwcg7") pod "bfeb694c-7cb8-4949-863d-bb135058a418" (UID: "bfeb694c-7cb8-4949-863d-bb135058a418"). InnerVolumeSpecName "kube-api-access-kwcg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.589156 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfeb694c-7cb8-4949-863d-bb135058a418" (UID: "bfeb694c-7cb8-4949-863d-bb135058a418"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.649540 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.649582 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwcg7\" (UniqueName: \"kubernetes.io/projected/bfeb694c-7cb8-4949-863d-bb135058a418-kube-api-access-kwcg7\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.649596 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfeb694c-7cb8-4949-863d-bb135058a418-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.882845 4895 generic.go:334] "Generic (PLEG): container finished" podID="bfeb694c-7cb8-4949-863d-bb135058a418" containerID="a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da" exitCode=0
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.882914 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4p5lm"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.882917 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerDied","Data":"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"}
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.883432 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4p5lm" event={"ID":"bfeb694c-7cb8-4949-863d-bb135058a418","Type":"ContainerDied","Data":"198ef48fa0644275b18899c2043d13997bc97171a9f35c298a3d04268c014670"}
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.883476 4895 scope.go:117] "RemoveContainer" containerID="a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.913255 4895 scope.go:117] "RemoveContainer" containerID="df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.924676 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4p5lm"]
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.934621 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4p5lm"]
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.939616 4895 scope.go:117] "RemoveContainer" containerID="084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.982365 4895 scope.go:117] "RemoveContainer" containerID="a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"
Mar 20 14:19:08 crc kubenswrapper[4895]: E0320 14:19:08.982857 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da\": container with ID starting with a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da not found: ID does not exist" containerID="a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.982910 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da"} err="failed to get container status \"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da\": rpc error: code = NotFound desc = could not find container \"a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da\": container with ID starting with a0f43eaf99264d1a24c035668147a925b54d57dd6352028cb8b58afe49ba88da not found: ID does not exist"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.982939 4895 scope.go:117] "RemoveContainer" containerID="df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"
Mar 20 14:19:08 crc kubenswrapper[4895]: E0320 14:19:08.983321 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522\": container with ID starting with df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522 not found: ID does not exist" containerID="df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.983342 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522"} err="failed to get container status \"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522\": rpc error: code = NotFound desc = could not find container \"df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522\": container with ID starting with df9759e7a0c15f1b76051374f6cafdde0a09a0146b606a8bfefdbdf28a5c9522 not found: ID does not exist"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.983354 4895 scope.go:117] "RemoveContainer" containerID="084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51"
Mar 20 14:19:08 crc kubenswrapper[4895]: E0320 14:19:08.983617 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51\": container with ID starting with 084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51 not found: ID does not exist" containerID="084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51"
Mar 20 14:19:08 crc kubenswrapper[4895]: I0320 14:19:08.983644 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51"} err="failed to get container status \"084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51\": rpc error: code = NotFound desc = could not find container \"084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51\": container with ID starting with 084ed930725f9c251fdde8694fb74ad13e01f2463a1ed5cd8aa39be22b32ae51 not found: ID does not exist"
Mar 20 14:19:09 crc kubenswrapper[4895]: I0320 14:19:09.226025 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" path="/var/lib/kubelet/pods/bfeb694c-7cb8-4949-863d-bb135058a418/volumes"
Mar 20 14:19:22 crc kubenswrapper[4895]: I0320 14:19:22.297535 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:19:22 crc kubenswrapper[4895]: I0320 14:19:22.298079 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:19:52 crc kubenswrapper[4895]: I0320 14:19:52.297079 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:19:52 crc kubenswrapper[4895]: I0320 14:19:52.297635 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.155776 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566940-t9bl8"]
Mar 20 14:20:00 crc kubenswrapper[4895]: E0320 14:20:00.156997 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="extract-utilities"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.157015 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="extract-utilities"
Mar 20 14:20:00 crc kubenswrapper[4895]: E0320 14:20:00.157046 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="extract-content"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.157054 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="extract-content"
Mar 20 14:20:00 crc kubenswrapper[4895]: E0320 14:20:00.157075 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="registry-server"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.157083 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="registry-server"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.157353 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfeb694c-7cb8-4949-863d-bb135058a418" containerName="registry-server"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.158329 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.160516 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.160818 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.161711 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.168872 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-t9bl8"]
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.344086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vsm\" (UniqueName: \"kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm\") pod \"auto-csr-approver-29566940-t9bl8\" (UID: \"447e4d29-c2ec-405f-96b0-968909b48eaf\") " pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.446041 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vsm\" (UniqueName: \"kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm\") pod \"auto-csr-approver-29566940-t9bl8\" (UID: \"447e4d29-c2ec-405f-96b0-968909b48eaf\") " pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.466229 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vsm\" (UniqueName: \"kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm\") pod \"auto-csr-approver-29566940-t9bl8\" (UID: \"447e4d29-c2ec-405f-96b0-968909b48eaf\") " pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.482269 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:00 crc kubenswrapper[4895]: I0320 14:20:00.920143 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-t9bl8"]
Mar 20 14:20:01 crc kubenswrapper[4895]: I0320 14:20:01.426314 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-t9bl8" event={"ID":"447e4d29-c2ec-405f-96b0-968909b48eaf","Type":"ContainerStarted","Data":"c5ddc046272bde446a1ecd337d2b77b438fd522ea8dec2b4386b18a0cd12dbd0"}
Mar 20 14:20:02 crc kubenswrapper[4895]: I0320 14:20:02.437406 4895 generic.go:334] "Generic (PLEG): container finished" podID="447e4d29-c2ec-405f-96b0-968909b48eaf" containerID="f5c6a387bf0f55806c943e577d27608f07d6b06f7d9e8d7c082f9e11071d0edb" exitCode=0
Mar 20 14:20:02 crc kubenswrapper[4895]: I0320 14:20:02.437556 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-t9bl8" event={"ID":"447e4d29-c2ec-405f-96b0-968909b48eaf","Type":"ContainerDied","Data":"f5c6a387bf0f55806c943e577d27608f07d6b06f7d9e8d7c082f9e11071d0edb"}
Mar 20 14:20:03 crc kubenswrapper[4895]: I0320 14:20:03.829085 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:03 crc kubenswrapper[4895]: I0320 14:20:03.922761 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9vsm\" (UniqueName: \"kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm\") pod \"447e4d29-c2ec-405f-96b0-968909b48eaf\" (UID: \"447e4d29-c2ec-405f-96b0-968909b48eaf\") "
Mar 20 14:20:03 crc kubenswrapper[4895]: I0320 14:20:03.944624 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm" (OuterVolumeSpecName: "kube-api-access-t9vsm") pod "447e4d29-c2ec-405f-96b0-968909b48eaf" (UID: "447e4d29-c2ec-405f-96b0-968909b48eaf"). InnerVolumeSpecName "kube-api-access-t9vsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.025383 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9vsm\" (UniqueName: \"kubernetes.io/projected/447e4d29-c2ec-405f-96b0-968909b48eaf-kube-api-access-t9vsm\") on node \"crc\" DevicePath \"\""
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.458144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-t9bl8" event={"ID":"447e4d29-c2ec-405f-96b0-968909b48eaf","Type":"ContainerDied","Data":"c5ddc046272bde446a1ecd337d2b77b438fd522ea8dec2b4386b18a0cd12dbd0"}
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.458196 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ddc046272bde446a1ecd337d2b77b438fd522ea8dec2b4386b18a0cd12dbd0"
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.458258 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-t9bl8"
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.920028 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-jpprb"]
Mar 20 14:20:04 crc kubenswrapper[4895]: I0320 14:20:04.929748 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-jpprb"]
Mar 20 14:20:05 crc kubenswrapper[4895]: I0320 14:20:05.222725 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbb5ed5-242f-4d8c-afe6-124d33991e31" path="/var/lib/kubelet/pods/0dbb5ed5-242f-4d8c-afe6-124d33991e31/volumes"
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.113860 4895 scope.go:117] "RemoveContainer" containerID="490d8d9c4be147598efeeac037e1926dff50769bfb47405f7a94380608a5e03b"
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.296995 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.297060 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.297107 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.297909 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.297975 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351" gracePeriod=600
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.669970 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351" exitCode=0
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.670192 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351"}
Mar 20 14:20:22 crc kubenswrapper[4895]: I0320 14:20:22.670300 4895 scope.go:117] "RemoveContainer" containerID="8765ea1f3e1e1dc2849a485e00dc9c5db3d365b644901ffe4c23e10b747b0633"
Mar 20 14:20:23 crc kubenswrapper[4895]: I0320 14:20:23.681965 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d"}
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.234717 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"]
Mar 20 14:20:33 crc kubenswrapper[4895]: E0320 14:20:33.235849 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="447e4d29-c2ec-405f-96b0-968909b48eaf" containerName="oc"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.235868 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="447e4d29-c2ec-405f-96b0-968909b48eaf" containerName="oc"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.236136 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="447e4d29-c2ec-405f-96b0-968909b48eaf" containerName="oc"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.238343 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.261685 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"]
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.359148 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.359253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2l5b\" (UniqueName: \"kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.359297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.461016 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.461561 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.461643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2l5b\" (UniqueName: \"kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.462070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.462103 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.483064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2l5b\" (UniqueName: \"kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b\") pod \"redhat-marketplace-jgqth\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") " pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:33 crc kubenswrapper[4895]: I0320 14:20:33.567934 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:34 crc kubenswrapper[4895]: I0320 14:20:34.052362 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"]
Mar 20 14:20:34 crc kubenswrapper[4895]: W0320 14:20:34.060410 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ba8b01_cbc4_48e2_ad57_f268fb7a4337.slice/crio-e7fe863bc8495b8532803c182dad72325d7781531c579ee587f8cbd8bf7751f0 WatchSource:0}: Error finding container e7fe863bc8495b8532803c182dad72325d7781531c579ee587f8cbd8bf7751f0: Status 404 returned error can't find the container with id e7fe863bc8495b8532803c182dad72325d7781531c579ee587f8cbd8bf7751f0
Mar 20 14:20:34 crc kubenswrapper[4895]: I0320 14:20:34.784724 4895 generic.go:334] "Generic (PLEG): container finished" podID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerID="6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a" exitCode=0
Mar 20 14:20:34 crc kubenswrapper[4895]: I0320 14:20:34.784841 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerDied","Data":"6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a"}
Mar 20 14:20:34 crc kubenswrapper[4895]: I0320 14:20:34.785024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerStarted","Data":"e7fe863bc8495b8532803c182dad72325d7781531c579ee587f8cbd8bf7751f0"}
Mar 20 14:20:35 crc kubenswrapper[4895]: I0320 14:20:35.795946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerStarted","Data":"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"}
Mar 20 14:20:36 crc kubenswrapper[4895]: I0320 14:20:36.807214 4895 generic.go:334] "Generic (PLEG): container finished" podID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerID="d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681" exitCode=0
Mar 20 14:20:36 crc kubenswrapper[4895]: I0320 14:20:36.807330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerDied","Data":"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"}
Mar 20 14:20:37 crc kubenswrapper[4895]: I0320 14:20:37.817706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerStarted","Data":"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"}
Mar 20 14:20:37 crc kubenswrapper[4895]: I0320 14:20:37.842772 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jgqth" podStartSLOduration=2.381246633 podStartE2EDuration="4.842754848s" podCreationTimestamp="2026-03-20 14:20:33 +0000 UTC" firstStartedPulling="2026-03-20 14:20:34.787778392 +0000 UTC m=+3534.297497368" lastFinishedPulling="2026-03-20 14:20:37.249286607 +0000 UTC m=+3536.759005583" observedRunningTime="2026-03-20 14:20:37.838766581 +0000 UTC m=+3537.348485547" watchObservedRunningTime="2026-03-20 14:20:37.842754848 +0000 UTC m=+3537.352473814"
Mar 20 14:20:43 crc kubenswrapper[4895]: I0320 14:20:43.568956 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:43 crc kubenswrapper[4895]: I0320 14:20:43.569524 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:43 crc kubenswrapper[4895]: I0320 14:20:43.629988 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:43 crc kubenswrapper[4895]: I0320 14:20:43.926853 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:43 crc kubenswrapper[4895]: I0320 14:20:43.982976 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"]
Mar 20 14:20:45 crc kubenswrapper[4895]: I0320 14:20:45.904827 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jgqth" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="registry-server" containerID="cri-o://71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9" gracePeriod=2
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.858494 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.916337 4895 generic.go:334] "Generic (PLEG): container finished" podID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerID="71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9" exitCode=0
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.916383 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jgqth"
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.916382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerDied","Data":"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"}
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.916559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jgqth" event={"ID":"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337","Type":"ContainerDied","Data":"e7fe863bc8495b8532803c182dad72325d7781531c579ee587f8cbd8bf7751f0"}
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.916600 4895 scope.go:117] "RemoveContainer" containerID="71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.946585 4895 scope.go:117] "RemoveContainer" containerID="d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.969528 4895 scope.go:117] "RemoveContainer" containerID="6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a"
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.985958 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content\") pod \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") "
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.986008 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2l5b\" (UniqueName: \"kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b\") pod \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") "
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.986040 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities\") pod \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\" (UID: \"a8ba8b01-cbc4-48e2-ad57-f268fb7a4337\") "
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.987000 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities" (OuterVolumeSpecName: "utilities") pod "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" (UID: "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:20:46 crc kubenswrapper[4895]: I0320 14:20:46.992773 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b" (OuterVolumeSpecName: "kube-api-access-h2l5b") pod "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" (UID: "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337"). InnerVolumeSpecName "kube-api-access-h2l5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.011112 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" (UID: "a8ba8b01-cbc4-48e2-ad57-f268fb7a4337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.081540 4895 scope.go:117] "RemoveContainer" containerID="71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"
Mar 20 14:20:47 crc kubenswrapper[4895]: E0320 14:20:47.081910 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9\": container with ID starting with 71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9 not found: ID does not exist" containerID="71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"
Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.081955 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9"} err="failed to get container status \"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9\": rpc error: code = NotFound desc = could not find container \"71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9\": container with ID starting with 71e6bd1cde7ec11d00449cb712f3435d57e03d6c7d958738089f8102133c6ac9 not found: ID does not exist"
Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.081989 4895 scope.go:117] "RemoveContainer" containerID="d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"
Mar 20 14:20:47 crc kubenswrapper[4895]: E0320 14:20:47.082305 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681\": container with ID starting with d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681 not found: ID does not exist" containerID="d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"
Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.082371
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681"} err="failed to get container status \"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681\": rpc error: code = NotFound desc = could not find container \"d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681\": container with ID starting with d50b68849e34604a9b3ab59fb139b235c718117554b50a9b12c9fb7a5a081681 not found: ID does not exist" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.082425 4895 scope.go:117] "RemoveContainer" containerID="6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a" Mar 20 14:20:47 crc kubenswrapper[4895]: E0320 14:20:47.083054 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a\": container with ID starting with 6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a not found: ID does not exist" containerID="6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.083085 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a"} err="failed to get container status \"6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a\": rpc error: code = NotFound desc = could not find container \"6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a\": container with ID starting with 6e01e24a6a3e8ea8b0f6d7b28629b2b263fb75d454471d7f986ae8762c9e887a not found: ID does not exist" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.088730 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.088772 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2l5b\" (UniqueName: \"kubernetes.io/projected/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-kube-api-access-h2l5b\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.088790 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.264466 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"] Mar 20 14:20:47 crc kubenswrapper[4895]: I0320 14:20:47.275464 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jgqth"] Mar 20 14:20:49 crc kubenswrapper[4895]: I0320 14:20:49.229198 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" path="/var/lib/kubelet/pods/a8ba8b01-cbc4-48e2-ad57-f268fb7a4337/volumes" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.141417 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566942-wkpmv"] Mar 20 14:22:00 crc kubenswrapper[4895]: E0320 14:22:00.142539 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="extract-content" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.142560 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="extract-content" Mar 20 14:22:00 crc kubenswrapper[4895]: E0320 14:22:00.142601 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" 
containerName="extract-utilities" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.142609 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="extract-utilities" Mar 20 14:22:00 crc kubenswrapper[4895]: E0320 14:22:00.142640 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="registry-server" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.142648 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="registry-server" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.142883 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ba8b01-cbc4-48e2-ad57-f268fb7a4337" containerName="registry-server" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.143821 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.146619 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.146804 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.147535 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.155895 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-wkpmv"] Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.248911 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94xx\" (UniqueName: 
\"kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx\") pod \"auto-csr-approver-29566942-wkpmv\" (UID: \"94b9d241-c303-49d9-b9d1-4062e8449dd4\") " pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.369283 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94xx\" (UniqueName: \"kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx\") pod \"auto-csr-approver-29566942-wkpmv\" (UID: \"94b9d241-c303-49d9-b9d1-4062e8449dd4\") " pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.393906 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94xx\" (UniqueName: \"kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx\") pod \"auto-csr-approver-29566942-wkpmv\" (UID: \"94b9d241-c303-49d9-b9d1-4062e8449dd4\") " pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.466546 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.905333 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-wkpmv"] Mar 20 14:22:00 crc kubenswrapper[4895]: I0320 14:22:00.909948 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:22:01 crc kubenswrapper[4895]: I0320 14:22:01.628174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" event={"ID":"94b9d241-c303-49d9-b9d1-4062e8449dd4","Type":"ContainerStarted","Data":"e1107251a91e344758ec72f2aa03e5a7f60b86b835a4aed0707d59fa1e48a744"} Mar 20 14:22:03 crc kubenswrapper[4895]: I0320 14:22:03.647065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" event={"ID":"94b9d241-c303-49d9-b9d1-4062e8449dd4","Type":"ContainerDied","Data":"439220bb99e216fc6d9c2fe51e28bd1de403cb1f61fd9cd7f40e9aeb97765886"} Mar 20 14:22:03 crc kubenswrapper[4895]: I0320 14:22:03.647422 4895 generic.go:334] "Generic (PLEG): container finished" podID="94b9d241-c303-49d9-b9d1-4062e8449dd4" containerID="439220bb99e216fc6d9c2fe51e28bd1de403cb1f61fd9cd7f40e9aeb97765886" exitCode=0 Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.069793 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.126353 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94xx\" (UniqueName: \"kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx\") pod \"94b9d241-c303-49d9-b9d1-4062e8449dd4\" (UID: \"94b9d241-c303-49d9-b9d1-4062e8449dd4\") " Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.134328 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx" (OuterVolumeSpecName: "kube-api-access-x94xx") pod "94b9d241-c303-49d9-b9d1-4062e8449dd4" (UID: "94b9d241-c303-49d9-b9d1-4062e8449dd4"). InnerVolumeSpecName "kube-api-access-x94xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.229172 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94xx\" (UniqueName: \"kubernetes.io/projected/94b9d241-c303-49d9-b9d1-4062e8449dd4-kube-api-access-x94xx\") on node \"crc\" DevicePath \"\"" Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.666080 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" event={"ID":"94b9d241-c303-49d9-b9d1-4062e8449dd4","Type":"ContainerDied","Data":"e1107251a91e344758ec72f2aa03e5a7f60b86b835a4aed0707d59fa1e48a744"} Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.666450 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1107251a91e344758ec72f2aa03e5a7f60b86b835a4aed0707d59fa1e48a744" Mar 20 14:22:05 crc kubenswrapper[4895]: I0320 14:22:05.666133 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-wkpmv" Mar 20 14:22:06 crc kubenswrapper[4895]: I0320 14:22:06.141600 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-ch8pz"] Mar 20 14:22:06 crc kubenswrapper[4895]: I0320 14:22:06.152363 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-ch8pz"] Mar 20 14:22:07 crc kubenswrapper[4895]: I0320 14:22:07.224551 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc568a3-92a8-40ab-9fe2-da6100500652" path="/var/lib/kubelet/pods/5fc568a3-92a8-40ab-9fe2-da6100500652/volumes" Mar 20 14:22:22 crc kubenswrapper[4895]: I0320 14:22:22.221577 4895 scope.go:117] "RemoveContainer" containerID="c901bebdeb73549a39593e48b120e2b2d1d4f10a3e013001edbf7e88cc27a93e" Mar 20 14:22:22 crc kubenswrapper[4895]: I0320 14:22:22.297087 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:22:22 crc kubenswrapper[4895]: I0320 14:22:22.297140 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:22:52 crc kubenswrapper[4895]: I0320 14:22:52.297516 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:22:52 crc kubenswrapper[4895]: 
I0320 14:22:52.298040 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:23:22 crc kubenswrapper[4895]: I0320 14:23:22.300719 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:23:22 crc kubenswrapper[4895]: I0320 14:23:22.301269 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:23:22 crc kubenswrapper[4895]: I0320 14:23:22.301322 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 14:23:22 crc kubenswrapper[4895]: I0320 14:23:22.302144 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:23:22 crc kubenswrapper[4895]: I0320 14:23:22.302198 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" 
containerName="machine-config-daemon" containerID="cri-o://4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" gracePeriod=600 Mar 20 14:23:22 crc kubenswrapper[4895]: E0320 14:23:22.554701 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:23:23 crc kubenswrapper[4895]: I0320 14:23:23.386772 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" exitCode=0 Mar 20 14:23:23 crc kubenswrapper[4895]: I0320 14:23:23.386870 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d"} Mar 20 14:23:23 crc kubenswrapper[4895]: I0320 14:23:23.386922 4895 scope.go:117] "RemoveContainer" containerID="fd21af21c0610037c853b443044efaee805090b8319136b3897ac4824f236351" Mar 20 14:23:23 crc kubenswrapper[4895]: I0320 14:23:23.387902 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:23:23 crc kubenswrapper[4895]: E0320 14:23:23.388381 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:23:38 crc kubenswrapper[4895]: I0320 14:23:38.212229 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:23:38 crc kubenswrapper[4895]: E0320 14:23:38.212957 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:23:50 crc kubenswrapper[4895]: I0320 14:23:50.211688 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:23:50 crc kubenswrapper[4895]: E0320 14:23:50.212555 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.190507 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566944-lgxm8"] Mar 20 14:24:00 crc kubenswrapper[4895]: E0320 14:24:00.192876 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9d241-c303-49d9-b9d1-4062e8449dd4" containerName="oc" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.197686 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9d241-c303-49d9-b9d1-4062e8449dd4" containerName="oc" 
Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.198189 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b9d241-c303-49d9-b9d1-4062e8449dd4" containerName="oc" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.199149 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.203640 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.205100 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.205273 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.208616 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-lgxm8"] Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.353623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mx4\" (UniqueName: \"kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4\") pod \"auto-csr-approver-29566944-lgxm8\" (UID: \"06b2b4a1-0e22-479c-9e15-527408b4b9cc\") " pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.455736 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mx4\" (UniqueName: \"kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4\") pod \"auto-csr-approver-29566944-lgxm8\" (UID: \"06b2b4a1-0e22-479c-9e15-527408b4b9cc\") " pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 
14:24:00.476445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mx4\" (UniqueName: \"kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4\") pod \"auto-csr-approver-29566944-lgxm8\" (UID: \"06b2b4a1-0e22-479c-9e15-527408b4b9cc\") " pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.523593 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:00 crc kubenswrapper[4895]: I0320 14:24:00.990842 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-lgxm8"] Mar 20 14:24:01 crc kubenswrapper[4895]: I0320 14:24:01.807447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" event={"ID":"06b2b4a1-0e22-479c-9e15-527408b4b9cc","Type":"ContainerStarted","Data":"abd396eb22dba38b1233711dab7451f59bf7a87d2aa02725f3392f7c9aeb2e7a"} Mar 20 14:24:02 crc kubenswrapper[4895]: I0320 14:24:02.211325 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:24:02 crc kubenswrapper[4895]: E0320 14:24:02.212035 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:24:02 crc kubenswrapper[4895]: I0320 14:24:02.816981 4895 generic.go:334] "Generic (PLEG): container finished" podID="06b2b4a1-0e22-479c-9e15-527408b4b9cc" containerID="cb7b5d075ede93ffba649482b6c9dd9965b57fd370aba5b1913173d8f573b6dd" exitCode=0 Mar 20 14:24:02 
crc kubenswrapper[4895]: I0320 14:24:02.817037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" event={"ID":"06b2b4a1-0e22-479c-9e15-527408b4b9cc","Type":"ContainerDied","Data":"cb7b5d075ede93ffba649482b6c9dd9965b57fd370aba5b1913173d8f573b6dd"} Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.328806 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.436098 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5mx4\" (UniqueName: \"kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4\") pod \"06b2b4a1-0e22-479c-9e15-527408b4b9cc\" (UID: \"06b2b4a1-0e22-479c-9e15-527408b4b9cc\") " Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.448609 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4" (OuterVolumeSpecName: "kube-api-access-m5mx4") pod "06b2b4a1-0e22-479c-9e15-527408b4b9cc" (UID: "06b2b4a1-0e22-479c-9e15-527408b4b9cc"). InnerVolumeSpecName "kube-api-access-m5mx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.541995 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5mx4\" (UniqueName: \"kubernetes.io/projected/06b2b4a1-0e22-479c-9e15-527408b4b9cc-kube-api-access-m5mx4\") on node \"crc\" DevicePath \"\"" Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.836218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" event={"ID":"06b2b4a1-0e22-479c-9e15-527408b4b9cc","Type":"ContainerDied","Data":"abd396eb22dba38b1233711dab7451f59bf7a87d2aa02725f3392f7c9aeb2e7a"} Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.836265 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd396eb22dba38b1233711dab7451f59bf7a87d2aa02725f3392f7c9aeb2e7a" Mar 20 14:24:04 crc kubenswrapper[4895]: I0320 14:24:04.836330 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-lgxm8" Mar 20 14:24:05 crc kubenswrapper[4895]: I0320 14:24:05.419289 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-gp7bk"] Mar 20 14:24:05 crc kubenswrapper[4895]: I0320 14:24:05.427628 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-gp7bk"] Mar 20 14:24:07 crc kubenswrapper[4895]: I0320 14:24:07.223975 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3094ddb0-fb66-43a1-8e8a-da51248c25ce" path="/var/lib/kubelet/pods/3094ddb0-fb66-43a1-8e8a-da51248c25ce/volumes" Mar 20 14:24:16 crc kubenswrapper[4895]: I0320 14:24:16.212733 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:24:16 crc kubenswrapper[4895]: E0320 14:24:16.213613 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:24:22 crc kubenswrapper[4895]: I0320 14:24:22.308481 4895 scope.go:117] "RemoveContainer" containerID="19f580266f7f00e77ce0f46a8abaf9c8316752c400e7455f4bc4576e0aa06b5d" Mar 20 14:24:27 crc kubenswrapper[4895]: I0320 14:24:27.211683 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:24:27 crc kubenswrapper[4895]: E0320 14:24:27.212378 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:24:41 crc kubenswrapper[4895]: I0320 14:24:41.221127 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:24:41 crc kubenswrapper[4895]: E0320 14:24:41.222973 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:24:52 crc kubenswrapper[4895]: I0320 14:24:52.211744 4895 scope.go:117] "RemoveContainer" 
containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:24:52 crc kubenswrapper[4895]: E0320 14:24:52.212908 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:25:07 crc kubenswrapper[4895]: I0320 14:25:07.212622 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:25:07 crc kubenswrapper[4895]: E0320 14:25:07.213744 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:25:19 crc kubenswrapper[4895]: I0320 14:25:19.212685 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:25:19 crc kubenswrapper[4895]: E0320 14:25:19.213591 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:25:31 crc kubenswrapper[4895]: I0320 14:25:31.218491 4895 scope.go:117] 
"RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:25:31 crc kubenswrapper[4895]: E0320 14:25:31.219257 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:25:44 crc kubenswrapper[4895]: I0320 14:25:44.212471 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:25:44 crc kubenswrapper[4895]: E0320 14:25:44.214544 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:25:55 crc kubenswrapper[4895]: I0320 14:25:55.212712 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:25:55 crc kubenswrapper[4895]: E0320 14:25:55.213676 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.145745 
4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566946-fltp7"] Mar 20 14:26:00 crc kubenswrapper[4895]: E0320 14:26:00.146828 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b2b4a1-0e22-479c-9e15-527408b4b9cc" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.146844 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b2b4a1-0e22-479c-9e15-527408b4b9cc" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.147138 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b2b4a1-0e22-479c-9e15-527408b4b9cc" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.148057 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.150717 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.150858 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.150901 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.166451 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-fltp7"] Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.265333 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qp2g\" (UniqueName: \"kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g\") pod \"auto-csr-approver-29566946-fltp7\" (UID: \"9993560a-6827-4e64-9690-b2fdf2f91c8d\") " 
pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.368100 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qp2g\" (UniqueName: \"kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g\") pod \"auto-csr-approver-29566946-fltp7\" (UID: \"9993560a-6827-4e64-9690-b2fdf2f91c8d\") " pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.387872 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qp2g\" (UniqueName: \"kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g\") pod \"auto-csr-approver-29566946-fltp7\" (UID: \"9993560a-6827-4e64-9690-b2fdf2f91c8d\") " pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.471965 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:00 crc kubenswrapper[4895]: I0320 14:26:00.946925 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-fltp7"] Mar 20 14:26:01 crc kubenswrapper[4895]: I0320 14:26:01.922825 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-fltp7" event={"ID":"9993560a-6827-4e64-9690-b2fdf2f91c8d","Type":"ContainerStarted","Data":"55de51a7cd8b5c7a140271da8035258221a2db75708189a7d094b18dd45a45c2"} Mar 20 14:26:03 crc kubenswrapper[4895]: I0320 14:26:03.944148 4895 generic.go:334] "Generic (PLEG): container finished" podID="9993560a-6827-4e64-9690-b2fdf2f91c8d" containerID="d4dc636e22497b7319a4285023569d5fec67d298a2e2bddea27e60c518b8d329" exitCode=0 Mar 20 14:26:03 crc kubenswrapper[4895]: I0320 14:26:03.944230 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566946-fltp7" event={"ID":"9993560a-6827-4e64-9690-b2fdf2f91c8d","Type":"ContainerDied","Data":"d4dc636e22497b7319a4285023569d5fec67d298a2e2bddea27e60c518b8d329"} Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.440476 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.491549 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qp2g\" (UniqueName: \"kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g\") pod \"9993560a-6827-4e64-9690-b2fdf2f91c8d\" (UID: \"9993560a-6827-4e64-9690-b2fdf2f91c8d\") " Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.512419 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g" (OuterVolumeSpecName: "kube-api-access-7qp2g") pod "9993560a-6827-4e64-9690-b2fdf2f91c8d" (UID: "9993560a-6827-4e64-9690-b2fdf2f91c8d"). InnerVolumeSpecName "kube-api-access-7qp2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.594514 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qp2g\" (UniqueName: \"kubernetes.io/projected/9993560a-6827-4e64-9690-b2fdf2f91c8d-kube-api-access-7qp2g\") on node \"crc\" DevicePath \"\"" Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.963267 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-fltp7" event={"ID":"9993560a-6827-4e64-9690-b2fdf2f91c8d","Type":"ContainerDied","Data":"55de51a7cd8b5c7a140271da8035258221a2db75708189a7d094b18dd45a45c2"} Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.963307 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55de51a7cd8b5c7a140271da8035258221a2db75708189a7d094b18dd45a45c2" Mar 20 14:26:05 crc kubenswrapper[4895]: I0320 14:26:05.963355 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-fltp7" Mar 20 14:26:06 crc kubenswrapper[4895]: I0320 14:26:06.510079 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-t9bl8"] Mar 20 14:26:06 crc kubenswrapper[4895]: I0320 14:26:06.525640 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-t9bl8"] Mar 20 14:26:07 crc kubenswrapper[4895]: I0320 14:26:07.225380 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="447e4d29-c2ec-405f-96b0-968909b48eaf" path="/var/lib/kubelet/pods/447e4d29-c2ec-405f-96b0-968909b48eaf/volumes" Mar 20 14:26:08 crc kubenswrapper[4895]: I0320 14:26:08.211523 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:26:08 crc kubenswrapper[4895]: E0320 14:26:08.211787 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:26:22 crc kubenswrapper[4895]: I0320 14:26:22.212242 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:26:22 crc kubenswrapper[4895]: E0320 14:26:22.213078 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:26:22 crc kubenswrapper[4895]: I0320 14:26:22.419957 4895 scope.go:117] "RemoveContainer" containerID="f5c6a387bf0f55806c943e577d27608f07d6b06f7d9e8d7c082f9e11071d0edb" Mar 20 14:26:36 crc kubenswrapper[4895]: I0320 14:26:36.212482 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:26:36 crc kubenswrapper[4895]: E0320 14:26:36.213359 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:26:47 crc kubenswrapper[4895]: I0320 14:26:47.212189 4895 scope.go:117] "RemoveContainer" 
containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:26:47 crc kubenswrapper[4895]: E0320 14:26:47.212902 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:00 crc kubenswrapper[4895]: I0320 14:27:00.211927 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:27:00 crc kubenswrapper[4895]: E0320 14:27:00.212852 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:12 crc kubenswrapper[4895]: I0320 14:27:12.212144 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:27:12 crc kubenswrapper[4895]: E0320 14:27:12.213562 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.849709 4895 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:14 crc kubenswrapper[4895]: E0320 14:27:14.850443 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9993560a-6827-4e64-9690-b2fdf2f91c8d" containerName="oc" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.850456 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9993560a-6827-4e64-9690-b2fdf2f91c8d" containerName="oc" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.850707 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9993560a-6827-4e64-9690-b2fdf2f91c8d" containerName="oc" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.852543 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.877418 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.901264 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.901347 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jvh4\" (UniqueName: \"kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:14 crc kubenswrapper[4895]: I0320 14:27:14.901380 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.003024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.003115 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jvh4\" (UniqueName: \"kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.003149 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.003663 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.003734 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.021092 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jvh4\" (UniqueName: \"kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4\") pod \"certified-operators-hqdq6\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.181398 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:15 crc kubenswrapper[4895]: I0320 14:27:15.665221 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:16 crc kubenswrapper[4895]: I0320 14:27:16.678784 4895 generic.go:334] "Generic (PLEG): container finished" podID="792c98e7-6003-49a5-b801-2209b7859e54" containerID="bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03" exitCode=0 Mar 20 14:27:16 crc kubenswrapper[4895]: I0320 14:27:16.679042 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerDied","Data":"bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03"} Mar 20 14:27:16 crc kubenswrapper[4895]: I0320 14:27:16.679067 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerStarted","Data":"1cb0b1feb5d350a01481085c431271e45fce9616ecfca18f002e9c41037ec0ce"} Mar 20 14:27:16 crc kubenswrapper[4895]: I0320 
14:27:16.681169 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:27:18 crc kubenswrapper[4895]: I0320 14:27:18.696369 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerStarted","Data":"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0"} Mar 20 14:27:19 crc kubenswrapper[4895]: I0320 14:27:19.708593 4895 generic.go:334] "Generic (PLEG): container finished" podID="792c98e7-6003-49a5-b801-2209b7859e54" containerID="103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0" exitCode=0 Mar 20 14:27:19 crc kubenswrapper[4895]: I0320 14:27:19.708739 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerDied","Data":"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0"} Mar 20 14:27:20 crc kubenswrapper[4895]: I0320 14:27:20.721122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerStarted","Data":"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e"} Mar 20 14:27:20 crc kubenswrapper[4895]: I0320 14:27:20.743704 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqdq6" podStartSLOduration=3.213588085 podStartE2EDuration="6.743681754s" podCreationTimestamp="2026-03-20 14:27:14 +0000 UTC" firstStartedPulling="2026-03-20 14:27:16.680938493 +0000 UTC m=+3936.190657459" lastFinishedPulling="2026-03-20 14:27:20.211032152 +0000 UTC m=+3939.720751128" observedRunningTime="2026-03-20 14:27:20.737430572 +0000 UTC m=+3940.247149558" watchObservedRunningTime="2026-03-20 14:27:20.743681754 +0000 UTC m=+3940.253400720" Mar 20 14:27:25 crc 
kubenswrapper[4895]: I0320 14:27:25.182533 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:25 crc kubenswrapper[4895]: I0320 14:27:25.182854 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:25 crc kubenswrapper[4895]: I0320 14:27:25.232073 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:25 crc kubenswrapper[4895]: I0320 14:27:25.819416 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:25 crc kubenswrapper[4895]: I0320 14:27:25.879645 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:26 crc kubenswrapper[4895]: I0320 14:27:26.211857 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:27:26 crc kubenswrapper[4895]: E0320 14:27:26.212148 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:27 crc kubenswrapper[4895]: I0320 14:27:27.782806 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqdq6" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="registry-server" containerID="cri-o://5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e" gracePeriod=2 Mar 20 14:27:28 crc kubenswrapper[4895]: 
I0320 14:27:28.422279 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.499163 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jvh4\" (UniqueName: \"kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4\") pod \"792c98e7-6003-49a5-b801-2209b7859e54\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.499535 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities\") pod \"792c98e7-6003-49a5-b801-2209b7859e54\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.499635 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content\") pod \"792c98e7-6003-49a5-b801-2209b7859e54\" (UID: \"792c98e7-6003-49a5-b801-2209b7859e54\") " Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.500349 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities" (OuterVolumeSpecName: "utilities") pod "792c98e7-6003-49a5-b801-2209b7859e54" (UID: "792c98e7-6003-49a5-b801-2209b7859e54"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.505247 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4" (OuterVolumeSpecName: "kube-api-access-9jvh4") pod "792c98e7-6003-49a5-b801-2209b7859e54" (UID: "792c98e7-6003-49a5-b801-2209b7859e54"). InnerVolumeSpecName "kube-api-access-9jvh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.553196 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "792c98e7-6003-49a5-b801-2209b7859e54" (UID: "792c98e7-6003-49a5-b801-2209b7859e54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.602297 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.602324 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792c98e7-6003-49a5-b801-2209b7859e54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.602335 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jvh4\" (UniqueName: \"kubernetes.io/projected/792c98e7-6003-49a5-b801-2209b7859e54-kube-api-access-9jvh4\") on node \"crc\" DevicePath \"\"" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.793984 4895 generic.go:334] "Generic (PLEG): container finished" podID="792c98e7-6003-49a5-b801-2209b7859e54" 
containerID="5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e" exitCode=0 Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.794024 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqdq6" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.794036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerDied","Data":"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e"} Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.794063 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqdq6" event={"ID":"792c98e7-6003-49a5-b801-2209b7859e54","Type":"ContainerDied","Data":"1cb0b1feb5d350a01481085c431271e45fce9616ecfca18f002e9c41037ec0ce"} Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.794082 4895 scope.go:117] "RemoveContainer" containerID="5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.825970 4895 scope.go:117] "RemoveContainer" containerID="103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.832157 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.841811 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqdq6"] Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.850824 4895 scope.go:117] "RemoveContainer" containerID="bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.902682 4895 scope.go:117] "RemoveContainer" containerID="5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e" Mar 20 
14:27:28 crc kubenswrapper[4895]: E0320 14:27:28.904829 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e\": container with ID starting with 5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e not found: ID does not exist" containerID="5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.904870 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e"} err="failed to get container status \"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e\": rpc error: code = NotFound desc = could not find container \"5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e\": container with ID starting with 5f83ced51583d520f065d3363b771f0a3aeac193146acdbc4837599cc054794e not found: ID does not exist" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.904901 4895 scope.go:117] "RemoveContainer" containerID="103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0" Mar 20 14:27:28 crc kubenswrapper[4895]: E0320 14:27:28.905375 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0\": container with ID starting with 103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0 not found: ID does not exist" containerID="103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.905435 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0"} err="failed to get container status 
\"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0\": rpc error: code = NotFound desc = could not find container \"103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0\": container with ID starting with 103d56aaccf150c51bfafd443c631fc0e4d67249be9d3641fe1f1e96d8480db0 not found: ID does not exist" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.905467 4895 scope.go:117] "RemoveContainer" containerID="bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03" Mar 20 14:27:28 crc kubenswrapper[4895]: E0320 14:27:28.906681 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03\": container with ID starting with bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03 not found: ID does not exist" containerID="bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03" Mar 20 14:27:28 crc kubenswrapper[4895]: I0320 14:27:28.906740 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03"} err="failed to get container status \"bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03\": rpc error: code = NotFound desc = could not find container \"bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03\": container with ID starting with bb343d35c5641c9639c2d8d4c62220d9d33b43209712d3bf0ba983d28c7ecb03 not found: ID does not exist" Mar 20 14:27:29 crc kubenswrapper[4895]: I0320 14:27:29.234589 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792c98e7-6003-49a5-b801-2209b7859e54" path="/var/lib/kubelet/pods/792c98e7-6003-49a5-b801-2209b7859e54/volumes" Mar 20 14:27:39 crc kubenswrapper[4895]: I0320 14:27:39.212105 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 
14:27:39 crc kubenswrapper[4895]: E0320 14:27:39.213068 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.417880 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:27:46 crc kubenswrapper[4895]: E0320 14:27:46.419202 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="extract-content" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.419218 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="extract-content" Mar 20 14:27:46 crc kubenswrapper[4895]: E0320 14:27:46.419227 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="registry-server" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.419233 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="registry-server" Mar 20 14:27:46 crc kubenswrapper[4895]: E0320 14:27:46.419248 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="extract-utilities" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.419255 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="extract-utilities" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.419457 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="792c98e7-6003-49a5-b801-2209b7859e54" containerName="registry-server" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.420898 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.429888 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.593698 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4zdl\" (UniqueName: \"kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.594028 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.594309 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.695844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " 
pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.695929 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4zdl\" (UniqueName: \"kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.695964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.696415 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.696467 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.718422 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4zdl\" (UniqueName: \"kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl\") pod \"redhat-operators-fqtgc\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " pod="openshift-marketplace/redhat-operators-fqtgc" Mar 
20 14:27:46 crc kubenswrapper[4895]: I0320 14:27:46.777974 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:47 crc kubenswrapper[4895]: I0320 14:27:47.238131 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:27:47 crc kubenswrapper[4895]: I0320 14:27:47.984105 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerID="1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77" exitCode=0 Mar 20 14:27:47 crc kubenswrapper[4895]: I0320 14:27:47.984168 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerDied","Data":"1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77"} Mar 20 14:27:47 crc kubenswrapper[4895]: I0320 14:27:47.984447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerStarted","Data":"8f02d6d55e78e8e04f22d59e2a53ff7963ca4bd70a9c513dcfe81bf17bae48b0"} Mar 20 14:27:50 crc kubenswrapper[4895]: I0320 14:27:50.005482 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerStarted","Data":"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b"} Mar 20 14:27:52 crc kubenswrapper[4895]: I0320 14:27:52.212022 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:27:52 crc kubenswrapper[4895]: E0320 14:27:52.213063 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:27:55 crc kubenswrapper[4895]: I0320 14:27:55.059293 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerID="b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b" exitCode=0 Mar 20 14:27:55 crc kubenswrapper[4895]: I0320 14:27:55.059383 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerDied","Data":"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b"} Mar 20 14:27:56 crc kubenswrapper[4895]: I0320 14:27:56.078046 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerStarted","Data":"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211"} Mar 20 14:27:56 crc kubenswrapper[4895]: I0320 14:27:56.106325 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqtgc" podStartSLOduration=2.589146221 podStartE2EDuration="10.10630818s" podCreationTimestamp="2026-03-20 14:27:46 +0000 UTC" firstStartedPulling="2026-03-20 14:27:47.986295132 +0000 UTC m=+3967.496014088" lastFinishedPulling="2026-03-20 14:27:55.503457081 +0000 UTC m=+3975.013176047" observedRunningTime="2026-03-20 14:27:56.097323982 +0000 UTC m=+3975.607042958" watchObservedRunningTime="2026-03-20 14:27:56.10630818 +0000 UTC m=+3975.616027146" Mar 20 14:27:56 crc kubenswrapper[4895]: I0320 14:27:56.779332 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:56 crc 
kubenswrapper[4895]: I0320 14:27:56.779506 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:27:57 crc kubenswrapper[4895]: I0320 14:27:57.836355 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqtgc" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" probeResult="failure" output=< Mar 20 14:27:57 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:27:57 crc kubenswrapper[4895]: > Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.147792 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566948-w6qx8"] Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.149861 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.152486 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.152633 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.153838 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.159227 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-w6qx8"] Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.284659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mddp\" (UniqueName: \"kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp\") pod \"auto-csr-approver-29566948-w6qx8\" (UID: 
\"e55b9290-2bd0-46f7-acc2-f5c763035c8c\") " pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.386364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mddp\" (UniqueName: \"kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp\") pod \"auto-csr-approver-29566948-w6qx8\" (UID: \"e55b9290-2bd0-46f7-acc2-f5c763035c8c\") " pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.408251 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mddp\" (UniqueName: \"kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp\") pod \"auto-csr-approver-29566948-w6qx8\" (UID: \"e55b9290-2bd0-46f7-acc2-f5c763035c8c\") " pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.467770 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:00 crc kubenswrapper[4895]: I0320 14:28:00.914847 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-w6qx8"] Mar 20 14:28:01 crc kubenswrapper[4895]: I0320 14:28:01.126675 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" event={"ID":"e55b9290-2bd0-46f7-acc2-f5c763035c8c","Type":"ContainerStarted","Data":"35b2fad68e2fa05ba75ed02daa0ce8160e7fe9ab2617676dc33ccd0eb5db93d2"} Mar 20 14:28:03 crc kubenswrapper[4895]: I0320 14:28:03.150828 4895 generic.go:334] "Generic (PLEG): container finished" podID="e55b9290-2bd0-46f7-acc2-f5c763035c8c" containerID="62c8782aae693c07656c62a7632510574829646754fb327ce3a40e68fe2c4a9a" exitCode=0 Mar 20 14:28:03 crc kubenswrapper[4895]: I0320 14:28:03.150909 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" event={"ID":"e55b9290-2bd0-46f7-acc2-f5c763035c8c","Type":"ContainerDied","Data":"62c8782aae693c07656c62a7632510574829646754fb327ce3a40e68fe2c4a9a"} Mar 20 14:28:04 crc kubenswrapper[4895]: I0320 14:28:04.588346 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:04 crc kubenswrapper[4895]: I0320 14:28:04.690324 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mddp\" (UniqueName: \"kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp\") pod \"e55b9290-2bd0-46f7-acc2-f5c763035c8c\" (UID: \"e55b9290-2bd0-46f7-acc2-f5c763035c8c\") " Mar 20 14:28:04 crc kubenswrapper[4895]: I0320 14:28:04.695977 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp" (OuterVolumeSpecName: "kube-api-access-7mddp") pod "e55b9290-2bd0-46f7-acc2-f5c763035c8c" (UID: "e55b9290-2bd0-46f7-acc2-f5c763035c8c"). InnerVolumeSpecName "kube-api-access-7mddp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:28:04 crc kubenswrapper[4895]: I0320 14:28:04.793068 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mddp\" (UniqueName: \"kubernetes.io/projected/e55b9290-2bd0-46f7-acc2-f5c763035c8c-kube-api-access-7mddp\") on node \"crc\" DevicePath \"\"" Mar 20 14:28:05 crc kubenswrapper[4895]: I0320 14:28:05.173782 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" event={"ID":"e55b9290-2bd0-46f7-acc2-f5c763035c8c","Type":"ContainerDied","Data":"35b2fad68e2fa05ba75ed02daa0ce8160e7fe9ab2617676dc33ccd0eb5db93d2"} Mar 20 14:28:05 crc kubenswrapper[4895]: I0320 14:28:05.173821 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b2fad68e2fa05ba75ed02daa0ce8160e7fe9ab2617676dc33ccd0eb5db93d2" Mar 20 14:28:05 crc kubenswrapper[4895]: I0320 14:28:05.173863 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-w6qx8" Mar 20 14:28:05 crc kubenswrapper[4895]: I0320 14:28:05.666314 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-wkpmv"] Mar 20 14:28:05 crc kubenswrapper[4895]: I0320 14:28:05.678686 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-wkpmv"] Mar 20 14:28:06 crc kubenswrapper[4895]: I0320 14:28:06.212113 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:28:06 crc kubenswrapper[4895]: E0320 14:28:06.212720 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:28:07 crc kubenswrapper[4895]: I0320 14:28:07.228538 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b9d241-c303-49d9-b9d1-4062e8449dd4" path="/var/lib/kubelet/pods/94b9d241-c303-49d9-b9d1-4062e8449dd4/volumes" Mar 20 14:28:07 crc kubenswrapper[4895]: I0320 14:28:07.824278 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqtgc" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" probeResult="failure" output=< Mar 20 14:28:07 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:28:07 crc kubenswrapper[4895]: > Mar 20 14:28:17 crc kubenswrapper[4895]: I0320 14:28:17.211674 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:28:17 crc kubenswrapper[4895]: E0320 
14:28:17.212655 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:28:17 crc kubenswrapper[4895]: I0320 14:28:17.825252 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqtgc" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" probeResult="failure" output=< Mar 20 14:28:17 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:28:17 crc kubenswrapper[4895]: > Mar 20 14:28:22 crc kubenswrapper[4895]: I0320 14:28:22.504634 4895 scope.go:117] "RemoveContainer" containerID="439220bb99e216fc6d9c2fe51e28bd1de403cb1f61fd9cd7f40e9aeb97765886" Mar 20 14:28:27 crc kubenswrapper[4895]: I0320 14:28:27.823479 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqtgc" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" probeResult="failure" output=< Mar 20 14:28:27 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:28:27 crc kubenswrapper[4895]: > Mar 20 14:28:31 crc kubenswrapper[4895]: I0320 14:28:31.222487 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:28:32 crc kubenswrapper[4895]: I0320 14:28:32.433411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c"} Mar 20 14:28:36 crc 
kubenswrapper[4895]: I0320 14:28:36.835020 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:28:36 crc kubenswrapper[4895]: I0320 14:28:36.884729 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:28:37 crc kubenswrapper[4895]: I0320 14:28:37.077809 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:28:38 crc kubenswrapper[4895]: I0320 14:28:38.489811 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqtgc" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" containerID="cri-o://486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211" gracePeriod=2 Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.033520 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.154366 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4zdl\" (UniqueName: \"kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl\") pod \"e6de948f-fd26-4691-9b52-a4b184e79d12\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.154693 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content\") pod \"e6de948f-fd26-4691-9b52-a4b184e79d12\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.154811 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities\") pod \"e6de948f-fd26-4691-9b52-a4b184e79d12\" (UID: \"e6de948f-fd26-4691-9b52-a4b184e79d12\") " Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.155787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities" (OuterVolumeSpecName: "utilities") pod "e6de948f-fd26-4691-9b52-a4b184e79d12" (UID: "e6de948f-fd26-4691-9b52-a4b184e79d12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.161359 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl" (OuterVolumeSpecName: "kube-api-access-q4zdl") pod "e6de948f-fd26-4691-9b52-a4b184e79d12" (UID: "e6de948f-fd26-4691-9b52-a4b184e79d12"). InnerVolumeSpecName "kube-api-access-q4zdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.258237 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.258737 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4zdl\" (UniqueName: \"kubernetes.io/projected/e6de948f-fd26-4691-9b52-a4b184e79d12-kube-api-access-q4zdl\") on node \"crc\" DevicePath \"\"" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.309422 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6de948f-fd26-4691-9b52-a4b184e79d12" (UID: "e6de948f-fd26-4691-9b52-a4b184e79d12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.361660 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6de948f-fd26-4691-9b52-a4b184e79d12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.500322 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerID="486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211" exitCode=0 Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.500363 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerDied","Data":"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211"} Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.500414 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fqtgc" event={"ID":"e6de948f-fd26-4691-9b52-a4b184e79d12","Type":"ContainerDied","Data":"8f02d6d55e78e8e04f22d59e2a53ff7963ca4bd70a9c513dcfe81bf17bae48b0"} Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.500444 4895 scope.go:117] "RemoveContainer" containerID="486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.500454 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqtgc" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.536689 4895 scope.go:117] "RemoveContainer" containerID="b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.539749 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.551735 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqtgc"] Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.559791 4895 scope.go:117] "RemoveContainer" containerID="1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.620643 4895 scope.go:117] "RemoveContainer" containerID="486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211" Mar 20 14:28:39 crc kubenswrapper[4895]: E0320 14:28:39.621119 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211\": container with ID starting with 486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211 not found: ID does not exist" containerID="486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.621158 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211"} err="failed to get container status \"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211\": rpc error: code = NotFound desc = could not find container \"486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211\": container with ID starting with 486483629cf0c44c8e0b30ee0a0d99f123c4dc57496103862562a29ed194a211 not found: ID does not exist" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.621183 4895 scope.go:117] "RemoveContainer" containerID="b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b" Mar 20 14:28:39 crc kubenswrapper[4895]: E0320 14:28:39.621680 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b\": container with ID starting with b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b not found: ID does not exist" containerID="b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.621711 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b"} err="failed to get container status \"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b\": rpc error: code = NotFound desc = could not find container \"b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b\": container with ID starting with b2272ec394cfa1e067db8ea5a8e56efa778941cbc7e8d1903dcd32a227efa17b not found: ID does not exist" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.621730 4895 scope.go:117] "RemoveContainer" containerID="1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77" Mar 20 14:28:39 crc kubenswrapper[4895]: E0320 
14:28:39.622007 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77\": container with ID starting with 1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77 not found: ID does not exist" containerID="1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77" Mar 20 14:28:39 crc kubenswrapper[4895]: I0320 14:28:39.622033 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77"} err="failed to get container status \"1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77\": rpc error: code = NotFound desc = could not find container \"1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77\": container with ID starting with 1d44c374bad9744e124c3421c17709364fe8549e79180da61c41b7d4a648ae77 not found: ID does not exist" Mar 20 14:28:41 crc kubenswrapper[4895]: I0320 14:28:41.229285 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" path="/var/lib/kubelet/pods/e6de948f-fd26-4691-9b52-a4b184e79d12/volumes" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.692084 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:32 crc kubenswrapper[4895]: E0320 14:29:32.693463 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="extract-utilities" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693482 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="extract-utilities" Mar 20 14:29:32 crc kubenswrapper[4895]: E0320 14:29:32.693499 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="extract-content" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693506 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="extract-content" Mar 20 14:29:32 crc kubenswrapper[4895]: E0320 14:29:32.693518 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55b9290-2bd0-46f7-acc2-f5c763035c8c" containerName="oc" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693525 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55b9290-2bd0-46f7-acc2-f5c763035c8c" containerName="oc" Mar 20 14:29:32 crc kubenswrapper[4895]: E0320 14:29:32.693543 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693551 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693950 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6de948f-fd26-4691-9b52-a4b184e79d12" containerName="registry-server" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.693964 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55b9290-2bd0-46f7-acc2-f5c763035c8c" containerName="oc" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.695475 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.720116 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.755195 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.755380 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwklt\" (UniqueName: \"kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.755988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.857920 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwklt\" (UniqueName: \"kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.858196 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.858301 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.859177 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.859214 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:32 crc kubenswrapper[4895]: I0320 14:29:32.877232 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwklt\" (UniqueName: \"kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt\") pod \"community-operators-4hljn\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:33 crc kubenswrapper[4895]: I0320 14:29:33.019522 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:33 crc kubenswrapper[4895]: I0320 14:29:33.553682 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:34 crc kubenswrapper[4895]: I0320 14:29:34.258657 4895 generic.go:334] "Generic (PLEG): container finished" podID="58d06745-e676-4e44-bd6c-037a2719c81d" containerID="601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6" exitCode=0 Mar 20 14:29:34 crc kubenswrapper[4895]: I0320 14:29:34.258713 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerDied","Data":"601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6"} Mar 20 14:29:34 crc kubenswrapper[4895]: I0320 14:29:34.258983 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerStarted","Data":"7fa6df0b26c2af885bc2050a20ed186079cdc319e79cd7bcec4579ca8447e0df"} Mar 20 14:29:35 crc kubenswrapper[4895]: I0320 14:29:35.271118 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerStarted","Data":"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4"} Mar 20 14:29:37 crc kubenswrapper[4895]: I0320 14:29:37.293478 4895 generic.go:334] "Generic (PLEG): container finished" podID="58d06745-e676-4e44-bd6c-037a2719c81d" containerID="81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4" exitCode=0 Mar 20 14:29:37 crc kubenswrapper[4895]: I0320 14:29:37.293570 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" 
event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerDied","Data":"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4"} Mar 20 14:29:39 crc kubenswrapper[4895]: I0320 14:29:39.314959 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerStarted","Data":"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68"} Mar 20 14:29:39 crc kubenswrapper[4895]: I0320 14:29:39.344512 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4hljn" podStartSLOduration=3.413696516 podStartE2EDuration="7.344486705s" podCreationTimestamp="2026-03-20 14:29:32 +0000 UTC" firstStartedPulling="2026-03-20 14:29:34.261181397 +0000 UTC m=+4073.770900363" lastFinishedPulling="2026-03-20 14:29:38.191971586 +0000 UTC m=+4077.701690552" observedRunningTime="2026-03-20 14:29:39.333047636 +0000 UTC m=+4078.842766602" watchObservedRunningTime="2026-03-20 14:29:39.344486705 +0000 UTC m=+4078.854205671" Mar 20 14:29:43 crc kubenswrapper[4895]: I0320 14:29:43.020793 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:43 crc kubenswrapper[4895]: I0320 14:29:43.021238 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:43 crc kubenswrapper[4895]: I0320 14:29:43.077564 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:43 crc kubenswrapper[4895]: I0320 14:29:43.415100 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:43 crc kubenswrapper[4895]: I0320 14:29:43.459493 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.382977 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4hljn" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="registry-server" containerID="cri-o://4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68" gracePeriod=2 Mar 20 14:29:45 crc kubenswrapper[4895]: E0320 14:29:45.679930 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d06745_e676_4e44_bd6c_037a2719c81d.slice/crio-conmon-4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.904732 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.954437 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities\") pod \"58d06745-e676-4e44-bd6c-037a2719c81d\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.954524 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwklt\" (UniqueName: \"kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt\") pod \"58d06745-e676-4e44-bd6c-037a2719c81d\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.954693 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content\") pod \"58d06745-e676-4e44-bd6c-037a2719c81d\" (UID: \"58d06745-e676-4e44-bd6c-037a2719c81d\") " Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.956293 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities" (OuterVolumeSpecName: "utilities") pod "58d06745-e676-4e44-bd6c-037a2719c81d" (UID: "58d06745-e676-4e44-bd6c-037a2719c81d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:29:45 crc kubenswrapper[4895]: I0320 14:29:45.969703 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt" (OuterVolumeSpecName: "kube-api-access-gwklt") pod "58d06745-e676-4e44-bd6c-037a2719c81d" (UID: "58d06745-e676-4e44-bd6c-037a2719c81d"). InnerVolumeSpecName "kube-api-access-gwklt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.057477 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.057513 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwklt\" (UniqueName: \"kubernetes.io/projected/58d06745-e676-4e44-bd6c-037a2719c81d-kube-api-access-gwklt\") on node \"crc\" DevicePath \"\"" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.393434 4895 generic.go:334] "Generic (PLEG): container finished" podID="58d06745-e676-4e44-bd6c-037a2719c81d" containerID="4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68" exitCode=0 Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.393504 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerDied","Data":"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68"} Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.393824 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hljn" event={"ID":"58d06745-e676-4e44-bd6c-037a2719c81d","Type":"ContainerDied","Data":"7fa6df0b26c2af885bc2050a20ed186079cdc319e79cd7bcec4579ca8447e0df"} Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.393848 4895 scope.go:117] "RemoveContainer" containerID="4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.393517 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hljn" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.412302 4895 scope.go:117] "RemoveContainer" containerID="81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.439415 4895 scope.go:117] "RemoveContainer" containerID="601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.487599 4895 scope.go:117] "RemoveContainer" containerID="4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68" Mar 20 14:29:46 crc kubenswrapper[4895]: E0320 14:29:46.487976 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68\": container with ID starting with 4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68 not found: ID does not exist" containerID="4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.488025 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68"} err="failed to get container status \"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68\": rpc error: code = NotFound desc = could not find container \"4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68\": container with ID starting with 4ebb5703e5345e960f26ce2258708695005a07d9163f2120aadee2f499c85a68 not found: ID does not exist" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.488058 4895 scope.go:117] "RemoveContainer" containerID="81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4" Mar 20 14:29:46 crc kubenswrapper[4895]: E0320 14:29:46.488466 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4\": container with ID starting with 81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4 not found: ID does not exist" containerID="81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.488501 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4"} err="failed to get container status \"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4\": rpc error: code = NotFound desc = could not find container \"81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4\": container with ID starting with 81207601e3ca76bb68f35dee3b075fdae7d780486a19fe0d8090ecb991782da4 not found: ID does not exist" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.488530 4895 scope.go:117] "RemoveContainer" containerID="601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6" Mar 20 14:29:46 crc kubenswrapper[4895]: E0320 14:29:46.488845 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6\": container with ID starting with 601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6 not found: ID does not exist" containerID="601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.488883 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6"} err="failed to get container status \"601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6\": rpc error: code = NotFound desc = could not find container 
\"601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6\": container with ID starting with 601f6fe3e40f1f5e083c7261bd05f8dcd7da818700a400e4366e18006e3f0ff6 not found: ID does not exist" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.522192 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58d06745-e676-4e44-bd6c-037a2719c81d" (UID: "58d06745-e676-4e44-bd6c-037a2719c81d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.566894 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d06745-e676-4e44-bd6c-037a2719c81d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.727845 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:46 crc kubenswrapper[4895]: I0320 14:29:46.741195 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4hljn"] Mar 20 14:29:47 crc kubenswrapper[4895]: I0320 14:29:47.226273 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" path="/var/lib/kubelet/pods/58d06745-e676-4e44-bd6c-037a2719c81d/volumes" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.147245 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566950-cq72h"] Mar 20 14:30:00 crc kubenswrapper[4895]: E0320 14:30:00.148415 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="extract-content" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.148436 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="extract-content" Mar 20 14:30:00 crc kubenswrapper[4895]: E0320 14:30:00.148466 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="registry-server" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.148480 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="registry-server" Mar 20 14:30:00 crc kubenswrapper[4895]: E0320 14:30:00.148513 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="extract-utilities" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.148522 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="extract-utilities" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.148756 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d06745-e676-4e44-bd6c-037a2719c81d" containerName="registry-server" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.149618 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.151801 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.151870 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.152141 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.170184 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn"] Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.186335 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.197510 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.197763 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.224486 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-cq72h"] Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.249223 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.249637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2gh\" (UniqueName: \"kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh\") pod \"auto-csr-approver-29566950-cq72h\" (UID: \"15619572-9b53-480d-be8e-bc20bb54c891\") " pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.249934 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpkxc\" (UniqueName: \"kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.250125 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.252675 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn"] Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.352449 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.352533 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.352629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2gh\" (UniqueName: \"kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh\") pod \"auto-csr-approver-29566950-cq72h\" (UID: \"15619572-9b53-480d-be8e-bc20bb54c891\") " pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.352670 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpkxc\" (UniqueName: \"kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.354113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.359200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.372154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2gh\" (UniqueName: \"kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh\") pod \"auto-csr-approver-29566950-cq72h\" (UID: \"15619572-9b53-480d-be8e-bc20bb54c891\") " pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.374636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpkxc\" (UniqueName: \"kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc\") pod \"collect-profiles-29566950-vj8qn\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.481963 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.539296 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:00 crc kubenswrapper[4895]: I0320 14:30:00.954265 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-cq72h"] Mar 20 14:30:01 crc kubenswrapper[4895]: I0320 14:30:01.160132 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn"] Mar 20 14:30:01 crc kubenswrapper[4895]: I0320 14:30:01.528766 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" event={"ID":"87a90201-3492-4850-8da4-d07d434cb08e","Type":"ContainerStarted","Data":"99949ed9d0ed6d24e11c1997a30099b772178cd4caaaa70bb151e660b0b6db71"} Mar 20 14:30:01 crc kubenswrapper[4895]: I0320 14:30:01.529123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" event={"ID":"87a90201-3492-4850-8da4-d07d434cb08e","Type":"ContainerStarted","Data":"6acd3264a5f46f4ce1cc4b5544550d6adb09041cee453a959966cdcaada61349"} Mar 20 14:30:01 crc kubenswrapper[4895]: I0320 14:30:01.530562 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-cq72h" event={"ID":"15619572-9b53-480d-be8e-bc20bb54c891","Type":"ContainerStarted","Data":"f1e91224d1d31c315686486c14c4c5865e7294583d69d6b095a222278c96b3cd"} Mar 20 14:30:01 crc kubenswrapper[4895]: I0320 14:30:01.552045 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" podStartSLOduration=1.552023033 podStartE2EDuration="1.552023033s" podCreationTimestamp="2026-03-20 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:30:01.550801343 +0000 UTC m=+4101.060520339" 
watchObservedRunningTime="2026-03-20 14:30:01.552023033 +0000 UTC m=+4101.061741999" Mar 20 14:30:02 crc kubenswrapper[4895]: I0320 14:30:02.542582 4895 generic.go:334] "Generic (PLEG): container finished" podID="87a90201-3492-4850-8da4-d07d434cb08e" containerID="99949ed9d0ed6d24e11c1997a30099b772178cd4caaaa70bb151e660b0b6db71" exitCode=0 Mar 20 14:30:02 crc kubenswrapper[4895]: I0320 14:30:02.542656 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" event={"ID":"87a90201-3492-4850-8da4-d07d434cb08e","Type":"ContainerDied","Data":"99949ed9d0ed6d24e11c1997a30099b772178cd4caaaa70bb151e660b0b6db71"} Mar 20 14:30:03 crc kubenswrapper[4895]: I0320 14:30:03.567506 4895 generic.go:334] "Generic (PLEG): container finished" podID="15619572-9b53-480d-be8e-bc20bb54c891" containerID="29a5ab8fb201e50d5a38fa445f32b715a540addbc73d52a960d36d8add48cc09" exitCode=0 Mar 20 14:30:03 crc kubenswrapper[4895]: I0320 14:30:03.567564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-cq72h" event={"ID":"15619572-9b53-480d-be8e-bc20bb54c891","Type":"ContainerDied","Data":"29a5ab8fb201e50d5a38fa445f32b715a540addbc73d52a960d36d8add48cc09"} Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.014364 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.134621 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume\") pod \"87a90201-3492-4850-8da4-d07d434cb08e\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.134679 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpkxc\" (UniqueName: \"kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc\") pod \"87a90201-3492-4850-8da4-d07d434cb08e\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.134990 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume\") pod \"87a90201-3492-4850-8da4-d07d434cb08e\" (UID: \"87a90201-3492-4850-8da4-d07d434cb08e\") " Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.135496 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume" (OuterVolumeSpecName: "config-volume") pod "87a90201-3492-4850-8da4-d07d434cb08e" (UID: "87a90201-3492-4850-8da4-d07d434cb08e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.140571 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc" (OuterVolumeSpecName: "kube-api-access-kpkxc") pod "87a90201-3492-4850-8da4-d07d434cb08e" (UID: "87a90201-3492-4850-8da4-d07d434cb08e"). 
InnerVolumeSpecName "kube-api-access-kpkxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.140598 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87a90201-3492-4850-8da4-d07d434cb08e" (UID: "87a90201-3492-4850-8da4-d07d434cb08e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.235570 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9"] Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.237219 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87a90201-3492-4850-8da4-d07d434cb08e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.237351 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpkxc\" (UniqueName: \"kubernetes.io/projected/87a90201-3492-4850-8da4-d07d434cb08e-kube-api-access-kpkxc\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.237456 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87a90201-3492-4850-8da4-d07d434cb08e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.249465 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-d7bv9"] Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.581334 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.581326 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-vj8qn" event={"ID":"87a90201-3492-4850-8da4-d07d434cb08e","Type":"ContainerDied","Data":"6acd3264a5f46f4ce1cc4b5544550d6adb09041cee453a959966cdcaada61349"} Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.581459 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6acd3264a5f46f4ce1cc4b5544550d6adb09041cee453a959966cdcaada61349" Mar 20 14:30:04 crc kubenswrapper[4895]: I0320 14:30:04.956588 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.053315 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2gh\" (UniqueName: \"kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh\") pod \"15619572-9b53-480d-be8e-bc20bb54c891\" (UID: \"15619572-9b53-480d-be8e-bc20bb54c891\") " Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.059606 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh" (OuterVolumeSpecName: "kube-api-access-4t2gh") pod "15619572-9b53-480d-be8e-bc20bb54c891" (UID: "15619572-9b53-480d-be8e-bc20bb54c891"). InnerVolumeSpecName "kube-api-access-4t2gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.155935 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2gh\" (UniqueName: \"kubernetes.io/projected/15619572-9b53-480d-be8e-bc20bb54c891-kube-api-access-4t2gh\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.221954 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6273873f-7a10-4969-a18b-c041d9500d8b" path="/var/lib/kubelet/pods/6273873f-7a10-4969-a18b-c041d9500d8b/volumes" Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.592760 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-cq72h" event={"ID":"15619572-9b53-480d-be8e-bc20bb54c891","Type":"ContainerDied","Data":"f1e91224d1d31c315686486c14c4c5865e7294583d69d6b095a222278c96b3cd"} Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.593080 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e91224d1d31c315686486c14c4c5865e7294583d69d6b095a222278c96b3cd" Mar 20 14:30:05 crc kubenswrapper[4895]: I0320 14:30:05.593146 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-cq72h" Mar 20 14:30:06 crc kubenswrapper[4895]: I0320 14:30:06.014184 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-lgxm8"] Mar 20 14:30:06 crc kubenswrapper[4895]: I0320 14:30:06.024370 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-lgxm8"] Mar 20 14:30:07 crc kubenswrapper[4895]: I0320 14:30:07.226063 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b2b4a1-0e22-479c-9e15-527408b4b9cc" path="/var/lib/kubelet/pods/06b2b4a1-0e22-479c-9e15-527408b4b9cc/volumes" Mar 20 14:30:22 crc kubenswrapper[4895]: I0320 14:30:22.644116 4895 scope.go:117] "RemoveContainer" containerID="cb7b5d075ede93ffba649482b6c9dd9965b57fd370aba5b1913173d8f573b6dd" Mar 20 14:30:22 crc kubenswrapper[4895]: I0320 14:30:22.699937 4895 scope.go:117] "RemoveContainer" containerID="586bf06bb776838e6fc1201a63fb4f25cdb2767a45120f8b490c3af9bc1293cf" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.510705 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:30:50 crc kubenswrapper[4895]: E0320 14:30:50.514239 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a90201-3492-4850-8da4-d07d434cb08e" containerName="collect-profiles" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.514275 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a90201-3492-4850-8da4-d07d434cb08e" containerName="collect-profiles" Mar 20 14:30:50 crc kubenswrapper[4895]: E0320 14:30:50.514293 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15619572-9b53-480d-be8e-bc20bb54c891" containerName="oc" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.514300 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="15619572-9b53-480d-be8e-bc20bb54c891" containerName="oc" Mar 20 14:30:50 crc 
kubenswrapper[4895]: I0320 14:30:50.514554 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="15619572-9b53-480d-be8e-bc20bb54c891" containerName="oc" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.514587 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a90201-3492-4850-8da4-d07d434cb08e" containerName="collect-profiles" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.516226 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.529089 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.606418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m642\" (UniqueName: \"kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.607027 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.607127 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc 
kubenswrapper[4895]: I0320 14:30:50.709046 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.709139 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.709181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m642\" (UniqueName: \"kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.709635 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.709729 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.729494 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m642\" (UniqueName: \"kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642\") pod \"redhat-marketplace-kpdgg\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:50 crc kubenswrapper[4895]: I0320 14:30:50.836552 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:30:51 crc kubenswrapper[4895]: I0320 14:30:51.350567 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:30:52 crc kubenswrapper[4895]: I0320 14:30:52.037357 4895 generic.go:334] "Generic (PLEG): container finished" podID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerID="3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7" exitCode=0 Mar 20 14:30:52 crc kubenswrapper[4895]: I0320 14:30:52.037420 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerDied","Data":"3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7"} Mar 20 14:30:52 crc kubenswrapper[4895]: I0320 14:30:52.037445 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerStarted","Data":"91994957af053ddbef3d79f07429a808ff09ee835d20ead4bcbb9c9106f4c625"} Mar 20 14:30:52 crc kubenswrapper[4895]: I0320 14:30:52.297240 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:30:52 crc kubenswrapper[4895]: I0320 
14:30:52.297304 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:30:53 crc kubenswrapper[4895]: I0320 14:30:53.048657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerStarted","Data":"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b"} Mar 20 14:30:54 crc kubenswrapper[4895]: I0320 14:30:54.061903 4895 generic.go:334] "Generic (PLEG): container finished" podID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerID="7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b" exitCode=0 Mar 20 14:30:54 crc kubenswrapper[4895]: I0320 14:30:54.062002 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerDied","Data":"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b"} Mar 20 14:30:55 crc kubenswrapper[4895]: I0320 14:30:55.073893 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerStarted","Data":"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3"} Mar 20 14:30:55 crc kubenswrapper[4895]: I0320 14:30:55.102982 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kpdgg" podStartSLOduration=2.662326475 podStartE2EDuration="5.10296531s" podCreationTimestamp="2026-03-20 14:30:50 +0000 UTC" firstStartedPulling="2026-03-20 14:30:52.039090609 +0000 UTC m=+4151.548809575" lastFinishedPulling="2026-03-20 
14:30:54.479729424 +0000 UTC m=+4153.989448410" observedRunningTime="2026-03-20 14:30:55.098575023 +0000 UTC m=+4154.608293989" watchObservedRunningTime="2026-03-20 14:30:55.10296531 +0000 UTC m=+4154.612684276" Mar 20 14:31:00 crc kubenswrapper[4895]: I0320 14:31:00.837258 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:00 crc kubenswrapper[4895]: I0320 14:31:00.838908 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:00 crc kubenswrapper[4895]: I0320 14:31:00.935545 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:01 crc kubenswrapper[4895]: I0320 14:31:01.192029 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:01 crc kubenswrapper[4895]: I0320 14:31:01.252385 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.161733 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kpdgg" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="registry-server" containerID="cri-o://01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3" gracePeriod=2 Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.744314 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.823043 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m642\" (UniqueName: \"kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642\") pod \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.823352 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content\") pod \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.823448 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities\") pod \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\" (UID: \"72f31a01-ec2b-4ad9-a984-3e0877f1f38f\") " Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.824363 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities" (OuterVolumeSpecName: "utilities") pod "72f31a01-ec2b-4ad9-a984-3e0877f1f38f" (UID: "72f31a01-ec2b-4ad9-a984-3e0877f1f38f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.832751 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642" (OuterVolumeSpecName: "kube-api-access-5m642") pod "72f31a01-ec2b-4ad9-a984-3e0877f1f38f" (UID: "72f31a01-ec2b-4ad9-a984-3e0877f1f38f"). InnerVolumeSpecName "kube-api-access-5m642". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.859811 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72f31a01-ec2b-4ad9-a984-3e0877f1f38f" (UID: "72f31a01-ec2b-4ad9-a984-3e0877f1f38f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.926423 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.926461 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m642\" (UniqueName: \"kubernetes.io/projected/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-kube-api-access-5m642\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:03 crc kubenswrapper[4895]: I0320 14:31:03.926478 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f31a01-ec2b-4ad9-a984-3e0877f1f38f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.172081 4895 generic.go:334] "Generic (PLEG): container finished" podID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerID="01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3" exitCode=0 Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.172129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerDied","Data":"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3"} Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.172181 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kpdgg" event={"ID":"72f31a01-ec2b-4ad9-a984-3e0877f1f38f","Type":"ContainerDied","Data":"91994957af053ddbef3d79f07429a808ff09ee835d20ead4bcbb9c9106f4c625"} Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.172179 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpdgg" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.172215 4895 scope.go:117] "RemoveContainer" containerID="01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.196943 4895 scope.go:117] "RemoveContainer" containerID="7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.202939 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.212806 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpdgg"] Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.233741 4895 scope.go:117] "RemoveContainer" containerID="3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.271116 4895 scope.go:117] "RemoveContainer" containerID="01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3" Mar 20 14:31:04 crc kubenswrapper[4895]: E0320 14:31:04.271615 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3\": container with ID starting with 01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3 not found: ID does not exist" containerID="01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.271650 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3"} err="failed to get container status \"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3\": rpc error: code = NotFound desc = could not find container \"01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3\": container with ID starting with 01b49ed31d289f67af4b09fd3a320f036bc84c3a28a8ded77cd69df325b15cb3 not found: ID does not exist" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.271675 4895 scope.go:117] "RemoveContainer" containerID="7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b" Mar 20 14:31:04 crc kubenswrapper[4895]: E0320 14:31:04.271910 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b\": container with ID starting with 7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b not found: ID does not exist" containerID="7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.271942 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b"} err="failed to get container status \"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b\": rpc error: code = NotFound desc = could not find container \"7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b\": container with ID starting with 7e89d387bb1ac48aad6de35fad1c6b8d911f61358e5de77d31d26e4da64c1e8b not found: ID does not exist" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.271962 4895 scope.go:117] "RemoveContainer" containerID="3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7" Mar 20 14:31:04 crc kubenswrapper[4895]: E0320 
14:31:04.272217 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7\": container with ID starting with 3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7 not found: ID does not exist" containerID="3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7" Mar 20 14:31:04 crc kubenswrapper[4895]: I0320 14:31:04.272244 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7"} err="failed to get container status \"3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7\": rpc error: code = NotFound desc = could not find container \"3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7\": container with ID starting with 3ea9bcf81f7c2c1363250444c91bdf27a5d814322b9e1a8eee0a73f44165baf7 not found: ID does not exist" Mar 20 14:31:05 crc kubenswrapper[4895]: I0320 14:31:05.223431 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" path="/var/lib/kubelet/pods/72f31a01-ec2b-4ad9-a984-3e0877f1f38f/volumes" Mar 20 14:31:22 crc kubenswrapper[4895]: I0320 14:31:22.297056 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:31:22 crc kubenswrapper[4895]: I0320 14:31:22.297746 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.297214 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.297825 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.297886 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.298632 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.298719 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c" gracePeriod=600 Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.699361 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" 
containerID="bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c" exitCode=0 Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.699440 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c"} Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.699741 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"} Mar 20 14:31:52 crc kubenswrapper[4895]: I0320 14:31:52.699764 4895 scope.go:117] "RemoveContainer" containerID="4052300fd1d6a1f6ee1c92d71552ac3e8eb98fa056d09ac3384fae9023826b7d" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.144440 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566952-4mz89"] Mar 20 14:32:00 crc kubenswrapper[4895]: E0320 14:32:00.145706 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.145727 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4895]: E0320 14:32:00.145761 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.145770 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4895]: E0320 14:32:00.145813 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.145821 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.146095 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f31a01-ec2b-4ad9-a984-3e0877f1f38f" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.147049 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-4mz89" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.150338 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.150881 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.150994 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.155991 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-4mz89"] Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.210648 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpw2x\" (UniqueName: \"kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x\") pod \"auto-csr-approver-29566952-4mz89\" (UID: \"44f0633f-eca4-4316-8e7e-6f1e4246802b\") " pod="openshift-infra/auto-csr-approver-29566952-4mz89" Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.312574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpw2x\" (UniqueName: \"kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x\") pod \"auto-csr-approver-29566952-4mz89\" (UID: \"44f0633f-eca4-4316-8e7e-6f1e4246802b\") " pod="openshift-infra/auto-csr-approver-29566952-4mz89"
Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.335893 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpw2x\" (UniqueName: \"kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x\") pod \"auto-csr-approver-29566952-4mz89\" (UID: \"44f0633f-eca4-4316-8e7e-6f1e4246802b\") " pod="openshift-infra/auto-csr-approver-29566952-4mz89"
Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.478947 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-4mz89"
Mar 20 14:32:00 crc kubenswrapper[4895]: I0320 14:32:00.952626 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-4mz89"]
Mar 20 14:32:01 crc kubenswrapper[4895]: I0320 14:32:01.795129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-4mz89" event={"ID":"44f0633f-eca4-4316-8e7e-6f1e4246802b","Type":"ContainerStarted","Data":"740d71c627895a5ba22da8dfa673027e8f8393475c6198ae98ecb90a2b560c55"}
Mar 20 14:32:02 crc kubenswrapper[4895]: I0320 14:32:02.814161 4895 generic.go:334] "Generic (PLEG): container finished" podID="44f0633f-eca4-4316-8e7e-6f1e4246802b" containerID="da0a9ae87bb8f0acf92922eefb6c1da8d67493bb8b21a24304b95740278a2d09" exitCode=0
Mar 20 14:32:02 crc kubenswrapper[4895]: I0320 14:32:02.814707 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-4mz89" event={"ID":"44f0633f-eca4-4316-8e7e-6f1e4246802b","Type":"ContainerDied","Data":"da0a9ae87bb8f0acf92922eefb6c1da8d67493bb8b21a24304b95740278a2d09"}
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.188243 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-4mz89"
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.324755 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpw2x\" (UniqueName: \"kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x\") pod \"44f0633f-eca4-4316-8e7e-6f1e4246802b\" (UID: \"44f0633f-eca4-4316-8e7e-6f1e4246802b\") "
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.331751 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x" (OuterVolumeSpecName: "kube-api-access-fpw2x") pod "44f0633f-eca4-4316-8e7e-6f1e4246802b" (UID: "44f0633f-eca4-4316-8e7e-6f1e4246802b"). InnerVolumeSpecName "kube-api-access-fpw2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.428336 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpw2x\" (UniqueName: \"kubernetes.io/projected/44f0633f-eca4-4316-8e7e-6f1e4246802b-kube-api-access-fpw2x\") on node \"crc\" DevicePath \"\""
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.835279 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-4mz89" event={"ID":"44f0633f-eca4-4316-8e7e-6f1e4246802b","Type":"ContainerDied","Data":"740d71c627895a5ba22da8dfa673027e8f8393475c6198ae98ecb90a2b560c55"}
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.835714 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="740d71c627895a5ba22da8dfa673027e8f8393475c6198ae98ecb90a2b560c55"
Mar 20 14:32:04 crc kubenswrapper[4895]: I0320 14:32:04.835334 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-4mz89"
Mar 20 14:32:05 crc kubenswrapper[4895]: I0320 14:32:05.264658 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-fltp7"]
Mar 20 14:32:05 crc kubenswrapper[4895]: I0320 14:32:05.276426 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-fltp7"]
Mar 20 14:32:07 crc kubenswrapper[4895]: I0320 14:32:07.222920 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9993560a-6827-4e64-9690-b2fdf2f91c8d" path="/var/lib/kubelet/pods/9993560a-6827-4e64-9690-b2fdf2f91c8d/volumes"
Mar 20 14:32:22 crc kubenswrapper[4895]: I0320 14:32:22.813443 4895 scope.go:117] "RemoveContainer" containerID="d4dc636e22497b7319a4285023569d5fec67d298a2e2bddea27e60c518b8d329"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.139711 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 14:33:03 crc kubenswrapper[4895]: E0320 14:33:03.140838 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f0633f-eca4-4316-8e7e-6f1e4246802b" containerName="oc"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.140866 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f0633f-eca4-4316-8e7e-6f1e4246802b" containerName="oc"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.141135 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f0633f-eca4-4316-8e7e-6f1e4246802b" containerName="oc"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.142048 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.152242 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.152293 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.152550 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.152721 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fchkv"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.167374 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.295600 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.295740 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.295776 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.296045 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.296375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.296729 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdhw\" (UniqueName: \"kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.296994 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.297156 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.297215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.399793 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.399873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdhw\" (UniqueName: \"kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.399908 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.399959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.399985 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400070 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400360 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.400789 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.401217 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.405826 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.407505 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.415092 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.416098 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.418716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdhw\" (UniqueName: \"kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.436296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " pod="openstack/tempest-tests-tempest"
Mar 20 14:33:03 crc kubenswrapper[4895]: I0320 14:33:03.471481 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 14:33:04 crc kubenswrapper[4895]: I0320 14:33:04.000810 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 14:33:04 crc kubenswrapper[4895]: I0320 14:33:04.006566 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:33:04 crc kubenswrapper[4895]: I0320 14:33:04.385806 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2acaa768-7497-437a-bd7d-46308eb5e0e2","Type":"ContainerStarted","Data":"5e00699812b8504e6d30b649c2b310121c4dde9edf208f4677404408bf0276aa"}
Mar 20 14:33:30 crc kubenswrapper[4895]: E0320 14:33:30.143879 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Mar 20 14:33:30 crc kubenswrapper[4895]: E0320 14:33:30.144720 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ggdhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(2acaa768-7497-437a-bd7d-46308eb5e0e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 14:33:30 crc kubenswrapper[4895]: E0320 14:33:30.146070 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="2acaa768-7497-437a-bd7d-46308eb5e0e2"
Mar 20 14:33:30 crc kubenswrapper[4895]: E0320 14:33:30.637423 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="2acaa768-7497-437a-bd7d-46308eb5e0e2"
Mar 20 14:33:42 crc kubenswrapper[4895]: I0320 14:33:42.685361 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 20 14:33:44 crc kubenswrapper[4895]: I0320 14:33:44.771479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2acaa768-7497-437a-bd7d-46308eb5e0e2","Type":"ContainerStarted","Data":"f299eacb03a9c62183ecabdce2cd481ad7f336fdb29525f2140f08490f1d3adc"}
Mar 20 14:33:44 crc kubenswrapper[4895]: I0320 14:33:44.806523 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.130047001 podStartE2EDuration="42.806504056s" podCreationTimestamp="2026-03-20 14:33:02 +0000 UTC" firstStartedPulling="2026-03-20 14:33:04.006318667 +0000 UTC m=+4283.516037623" lastFinishedPulling="2026-03-20 14:33:42.682775712 +0000 UTC m=+4322.192494678" observedRunningTime="2026-03-20 14:33:44.788580749 +0000 UTC m=+4324.298299715" watchObservedRunningTime="2026-03-20 14:33:44.806504056 +0000 UTC m=+4324.316223022"
Mar 20 14:33:52 crc kubenswrapper[4895]: I0320 14:33:52.299754 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:33:52 crc kubenswrapper[4895]: I0320 14:33:52.300343 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.151972 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566954-m495f"]
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.154452 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.156568 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.156888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.158540 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.162917 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-m495f"]
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.249534 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbhlj\" (UniqueName: \"kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj\") pod \"auto-csr-approver-29566954-m495f\" (UID: \"1027d17d-85ca-404c-9af3-2cef5d0b65b3\") " pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.351810 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbhlj\" (UniqueName: \"kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj\") pod \"auto-csr-approver-29566954-m495f\" (UID: \"1027d17d-85ca-404c-9af3-2cef5d0b65b3\") " pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:00 crc kubenswrapper[4895]: I0320 14:34:00.839313 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbhlj\" (UniqueName: \"kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj\") pod \"auto-csr-approver-29566954-m495f\" (UID: \"1027d17d-85ca-404c-9af3-2cef5d0b65b3\") " pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:01 crc kubenswrapper[4895]: I0320 14:34:01.081776 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:01 crc kubenswrapper[4895]: I0320 14:34:01.650937 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-m495f"]
Mar 20 14:34:01 crc kubenswrapper[4895]: I0320 14:34:01.932701 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-m495f" event={"ID":"1027d17d-85ca-404c-9af3-2cef5d0b65b3","Type":"ContainerStarted","Data":"a24477a08a775fc39647766d110012d41c32a08d949b4249c67504919a78cf2a"}
Mar 20 14:34:03 crc kubenswrapper[4895]: I0320 14:34:03.954507 4895 generic.go:334] "Generic (PLEG): container finished" podID="1027d17d-85ca-404c-9af3-2cef5d0b65b3" containerID="52aa608550d40774792e103aa42508f75c91e224e3b2a9cc09a79528c50ed27b" exitCode=0
Mar 20 14:34:03 crc kubenswrapper[4895]: I0320 14:34:03.954645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-m495f" event={"ID":"1027d17d-85ca-404c-9af3-2cef5d0b65b3","Type":"ContainerDied","Data":"52aa608550d40774792e103aa42508f75c91e224e3b2a9cc09a79528c50ed27b"}
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.382612 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.472344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbhlj\" (UniqueName: \"kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj\") pod \"1027d17d-85ca-404c-9af3-2cef5d0b65b3\" (UID: \"1027d17d-85ca-404c-9af3-2cef5d0b65b3\") "
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.493574 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj" (OuterVolumeSpecName: "kube-api-access-nbhlj") pod "1027d17d-85ca-404c-9af3-2cef5d0b65b3" (UID: "1027d17d-85ca-404c-9af3-2cef5d0b65b3"). InnerVolumeSpecName "kube-api-access-nbhlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.576814 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbhlj\" (UniqueName: \"kubernetes.io/projected/1027d17d-85ca-404c-9af3-2cef5d0b65b3-kube-api-access-nbhlj\") on node \"crc\" DevicePath \"\""
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.974983 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-m495f" event={"ID":"1027d17d-85ca-404c-9af3-2cef5d0b65b3","Type":"ContainerDied","Data":"a24477a08a775fc39647766d110012d41c32a08d949b4249c67504919a78cf2a"}
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.975287 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24477a08a775fc39647766d110012d41c32a08d949b4249c67504919a78cf2a"
Mar 20 14:34:05 crc kubenswrapper[4895]: I0320 14:34:05.975071 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-m495f"
Mar 20 14:34:06 crc kubenswrapper[4895]: I0320 14:34:06.470198 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-w6qx8"]
Mar 20 14:34:06 crc kubenswrapper[4895]: I0320 14:34:06.480440 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-w6qx8"]
Mar 20 14:34:07 crc kubenswrapper[4895]: I0320 14:34:07.223503 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55b9290-2bd0-46f7-acc2-f5c763035c8c" path="/var/lib/kubelet/pods/e55b9290-2bd0-46f7-acc2-f5c763035c8c/volumes"
Mar 20 14:34:22 crc kubenswrapper[4895]: I0320 14:34:22.297012 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:34:22 crc kubenswrapper[4895]: I0320 14:34:22.297599 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:34:22 crc kubenswrapper[4895]: I0320 14:34:22.925435 4895 scope.go:117] "RemoveContainer" containerID="62c8782aae693c07656c62a7632510574829646754fb327ce3a40e68fe2c4a9a"
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.296901 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.297541 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.297595 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.298627 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.298692 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" gracePeriod=600
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.434306 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" exitCode=0
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.434358 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"}
Mar 20 14:34:52 crc kubenswrapper[4895]: I0320 14:34:52.434413 4895 scope.go:117] "RemoveContainer" containerID="bcbed5dbab806af7eafbb95255e9e5688a2391a1781f0a8d858b4729f5c2d71c"
Mar 20 14:34:52 crc kubenswrapper[4895]: E0320 14:34:52.443959 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:34:53 crc kubenswrapper[4895]: I0320 14:34:53.444936 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:34:53 crc kubenswrapper[4895]: E0320 14:34:53.445624 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:35:07 crc kubenswrapper[4895]: I0320 14:35:07.211614 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:35:07 crc kubenswrapper[4895]: E0320 14:35:07.212382 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:35:19 crc kubenswrapper[4895]: I0320 14:35:19.212659 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:35:19 crc kubenswrapper[4895]: E0320 14:35:19.213455 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:35:34 crc kubenswrapper[4895]: I0320 14:35:34.212714 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:35:34 crc kubenswrapper[4895]: E0320 14:35:34.213661 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:35:46 crc kubenswrapper[4895]: I0320 14:35:46.211696 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:35:46 crc kubenswrapper[4895]: E0320 14:35:46.212722 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:35:58 crc kubenswrapper[4895]: I0320 14:35:58.211914 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:35:58 crc kubenswrapper[4895]: E0320 14:35:58.212635 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.170310 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566956-jvqtm"]
Mar 20 14:36:00 crc kubenswrapper[4895]: E0320 14:36:00.171433 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1027d17d-85ca-404c-9af3-2cef5d0b65b3" containerName="oc"
Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.171454 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1027d17d-85ca-404c-9af3-2cef5d0b65b3" containerName="oc"
Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.171874 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1027d17d-85ca-404c-9af3-2cef5d0b65b3" containerName="oc"
Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.174029 4895 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.177241 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.177296 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.177343 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.198018 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-jvqtm"] Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.224381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwq6n\" (UniqueName: \"kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n\") pod \"auto-csr-approver-29566956-jvqtm\" (UID: \"0e7ffe79-c789-40c3-9222-26170907245b\") " pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.327103 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwq6n\" (UniqueName: \"kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n\") pod \"auto-csr-approver-29566956-jvqtm\" (UID: \"0e7ffe79-c789-40c3-9222-26170907245b\") " pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.347811 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwq6n\" (UniqueName: \"kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n\") pod \"auto-csr-approver-29566956-jvqtm\" (UID: \"0e7ffe79-c789-40c3-9222-26170907245b\") " 
pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:00 crc kubenswrapper[4895]: I0320 14:36:00.518768 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:01 crc kubenswrapper[4895]: I0320 14:36:01.007239 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-jvqtm"] Mar 20 14:36:01 crc kubenswrapper[4895]: I0320 14:36:01.087946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" event={"ID":"0e7ffe79-c789-40c3-9222-26170907245b","Type":"ContainerStarted","Data":"a26091ce1bd12ff64bf59aed6e26470de6b98ebda75e7e73a3ebd8b1bb65fc83"} Mar 20 14:36:03 crc kubenswrapper[4895]: I0320 14:36:03.112515 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" event={"ID":"0e7ffe79-c789-40c3-9222-26170907245b","Type":"ContainerStarted","Data":"67703a66ac7a148ac9d0c3cb078b4ee94df93ff47c8ee82094ae197ba61e8693"} Mar 20 14:36:03 crc kubenswrapper[4895]: I0320 14:36:03.129180 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" podStartSLOduration=1.920685515 podStartE2EDuration="3.129156259s" podCreationTimestamp="2026-03-20 14:36:00 +0000 UTC" firstStartedPulling="2026-03-20 14:36:01.036353149 +0000 UTC m=+4460.546072115" lastFinishedPulling="2026-03-20 14:36:02.244823893 +0000 UTC m=+4461.754542859" observedRunningTime="2026-03-20 14:36:03.126857182 +0000 UTC m=+4462.636576148" watchObservedRunningTime="2026-03-20 14:36:03.129156259 +0000 UTC m=+4462.638875225" Mar 20 14:36:04 crc kubenswrapper[4895]: I0320 14:36:04.122015 4895 generic.go:334] "Generic (PLEG): container finished" podID="0e7ffe79-c789-40c3-9222-26170907245b" containerID="67703a66ac7a148ac9d0c3cb078b4ee94df93ff47c8ee82094ae197ba61e8693" exitCode=0 Mar 20 14:36:04 crc 
kubenswrapper[4895]: I0320 14:36:04.122112 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" event={"ID":"0e7ffe79-c789-40c3-9222-26170907245b","Type":"ContainerDied","Data":"67703a66ac7a148ac9d0c3cb078b4ee94df93ff47c8ee82094ae197ba61e8693"} Mar 20 14:36:06 crc kubenswrapper[4895]: I0320 14:36:06.361207 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:06 crc kubenswrapper[4895]: I0320 14:36:06.482585 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwq6n\" (UniqueName: \"kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n\") pod \"0e7ffe79-c789-40c3-9222-26170907245b\" (UID: \"0e7ffe79-c789-40c3-9222-26170907245b\") " Mar 20 14:36:06 crc kubenswrapper[4895]: I0320 14:36:06.493694 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n" (OuterVolumeSpecName: "kube-api-access-nwq6n") pod "0e7ffe79-c789-40c3-9222-26170907245b" (UID: "0e7ffe79-c789-40c3-9222-26170907245b"). InnerVolumeSpecName "kube-api-access-nwq6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:36:06 crc kubenswrapper[4895]: I0320 14:36:06.585509 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwq6n\" (UniqueName: \"kubernetes.io/projected/0e7ffe79-c789-40c3-9222-26170907245b-kube-api-access-nwq6n\") on node \"crc\" DevicePath \"\"" Mar 20 14:36:07 crc kubenswrapper[4895]: I0320 14:36:07.166101 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" event={"ID":"0e7ffe79-c789-40c3-9222-26170907245b","Type":"ContainerDied","Data":"a26091ce1bd12ff64bf59aed6e26470de6b98ebda75e7e73a3ebd8b1bb65fc83"} Mar 20 14:36:07 crc kubenswrapper[4895]: I0320 14:36:07.166165 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26091ce1bd12ff64bf59aed6e26470de6b98ebda75e7e73a3ebd8b1bb65fc83" Mar 20 14:36:07 crc kubenswrapper[4895]: I0320 14:36:07.166221 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-jvqtm" Mar 20 14:36:07 crc kubenswrapper[4895]: I0320 14:36:07.438124 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-cq72h"] Mar 20 14:36:07 crc kubenswrapper[4895]: I0320 14:36:07.451067 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-cq72h"] Mar 20 14:36:09 crc kubenswrapper[4895]: I0320 14:36:09.223438 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15619572-9b53-480d-be8e-bc20bb54c891" path="/var/lib/kubelet/pods/15619572-9b53-480d-be8e-bc20bb54c891/volumes" Mar 20 14:36:10 crc kubenswrapper[4895]: I0320 14:36:10.211813 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:36:10 crc kubenswrapper[4895]: E0320 14:36:10.212152 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:36:23 crc kubenswrapper[4895]: I0320 14:36:23.024550 4895 scope.go:117] "RemoveContainer" containerID="29a5ab8fb201e50d5a38fa445f32b715a540addbc73d52a960d36d8add48cc09" Mar 20 14:36:25 crc kubenswrapper[4895]: I0320 14:36:25.212691 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:36:25 crc kubenswrapper[4895]: E0320 14:36:25.213718 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:36:40 crc kubenswrapper[4895]: I0320 14:36:40.211720 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:36:40 crc kubenswrapper[4895]: E0320 14:36:40.212552 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:36:52 crc kubenswrapper[4895]: I0320 14:36:52.211467 4895 scope.go:117] "RemoveContainer" 
containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:36:52 crc kubenswrapper[4895]: E0320 14:36:52.212420 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:37:04 crc kubenswrapper[4895]: I0320 14:37:04.212116 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:37:04 crc kubenswrapper[4895]: E0320 14:37:04.212975 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:37:16 crc kubenswrapper[4895]: I0320 14:37:16.212018 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:37:16 crc kubenswrapper[4895]: E0320 14:37:16.212699 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:37:29 crc kubenswrapper[4895]: I0320 14:37:29.216056 4895 scope.go:117] 
"RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:37:29 crc kubenswrapper[4895]: E0320 14:37:29.216820 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:37:43 crc kubenswrapper[4895]: I0320 14:37:43.212283 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:37:43 crc kubenswrapper[4895]: E0320 14:37:43.213102 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:37:54 crc kubenswrapper[4895]: I0320 14:37:54.212790 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:37:54 crc kubenswrapper[4895]: E0320 14:37:54.213548 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.151241 
4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566958-8r855"] Mar 20 14:38:00 crc kubenswrapper[4895]: E0320 14:38:00.152338 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7ffe79-c789-40c3-9222-26170907245b" containerName="oc" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.152353 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7ffe79-c789-40c3-9222-26170907245b" containerName="oc" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.152662 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7ffe79-c789-40c3-9222-26170907245b" containerName="oc" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.153621 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.155817 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.155896 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.156451 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.163279 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-8r855"] Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.230731 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqws\" (UniqueName: \"kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws\") pod \"auto-csr-approver-29566958-8r855\" (UID: \"e1211640-b38a-4c4f-883f-da719892cd75\") " 
pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.333826 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqws\" (UniqueName: \"kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws\") pod \"auto-csr-approver-29566958-8r855\" (UID: \"e1211640-b38a-4c4f-883f-da719892cd75\") " pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.357064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqws\" (UniqueName: \"kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws\") pod \"auto-csr-approver-29566958-8r855\" (UID: \"e1211640-b38a-4c4f-883f-da719892cd75\") " pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:00 crc kubenswrapper[4895]: I0320 14:38:00.488578 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:01 crc kubenswrapper[4895]: I0320 14:38:01.349280 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-8r855"] Mar 20 14:38:02 crc kubenswrapper[4895]: I0320 14:38:02.319522 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-8r855" event={"ID":"e1211640-b38a-4c4f-883f-da719892cd75","Type":"ContainerStarted","Data":"a4b0ba95fdad959eac2ee8d7da6b7976c63b5d1c22365ad7a5d15405e7094f23"} Mar 20 14:38:03 crc kubenswrapper[4895]: I0320 14:38:03.330900 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-8r855" event={"ID":"e1211640-b38a-4c4f-883f-da719892cd75","Type":"ContainerStarted","Data":"747d4815c8649671f611e5e1f7a1158660273dcd81b036577fd9d7f6b909464f"} Mar 20 14:38:03 crc kubenswrapper[4895]: I0320 14:38:03.352246 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566958-8r855" podStartSLOduration=2.5318844719999998 podStartE2EDuration="3.352222273s" podCreationTimestamp="2026-03-20 14:38:00 +0000 UTC" firstStartedPulling="2026-03-20 14:38:01.369225585 +0000 UTC m=+4580.878944551" lastFinishedPulling="2026-03-20 14:38:02.189563386 +0000 UTC m=+4581.699282352" observedRunningTime="2026-03-20 14:38:03.345689293 +0000 UTC m=+4582.855408259" watchObservedRunningTime="2026-03-20 14:38:03.352222273 +0000 UTC m=+4582.861941239" Mar 20 14:38:05 crc kubenswrapper[4895]: I0320 14:38:05.350348 4895 generic.go:334] "Generic (PLEG): container finished" podID="e1211640-b38a-4c4f-883f-da719892cd75" containerID="747d4815c8649671f611e5e1f7a1158660273dcd81b036577fd9d7f6b909464f" exitCode=0 Mar 20 14:38:05 crc kubenswrapper[4895]: I0320 14:38:05.350627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-8r855" event={"ID":"e1211640-b38a-4c4f-883f-da719892cd75","Type":"ContainerDied","Data":"747d4815c8649671f611e5e1f7a1158660273dcd81b036577fd9d7f6b909464f"} Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.378920 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-8r855" event={"ID":"e1211640-b38a-4c4f-883f-da719892cd75","Type":"ContainerDied","Data":"a4b0ba95fdad959eac2ee8d7da6b7976c63b5d1c22365ad7a5d15405e7094f23"} Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.379360 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b0ba95fdad959eac2ee8d7da6b7976c63b5d1c22365ad7a5d15405e7094f23" Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.441260 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.493533 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bqws\" (UniqueName: \"kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws\") pod \"e1211640-b38a-4c4f-883f-da719892cd75\" (UID: \"e1211640-b38a-4c4f-883f-da719892cd75\") " Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.509623 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws" (OuterVolumeSpecName: "kube-api-access-6bqws") pod "e1211640-b38a-4c4f-883f-da719892cd75" (UID: "e1211640-b38a-4c4f-883f-da719892cd75"). InnerVolumeSpecName "kube-api-access-6bqws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:38:07 crc kubenswrapper[4895]: I0320 14:38:07.600820 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bqws\" (UniqueName: \"kubernetes.io/projected/e1211640-b38a-4c4f-883f-da719892cd75-kube-api-access-6bqws\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:08 crc kubenswrapper[4895]: I0320 14:38:08.386798 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-8r855" Mar 20 14:38:08 crc kubenswrapper[4895]: I0320 14:38:08.562467 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-4mz89"] Mar 20 14:38:08 crc kubenswrapper[4895]: I0320 14:38:08.575189 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-4mz89"] Mar 20 14:38:09 crc kubenswrapper[4895]: I0320 14:38:09.217258 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:38:09 crc kubenswrapper[4895]: E0320 14:38:09.217830 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:38:09 crc kubenswrapper[4895]: I0320 14:38:09.232045 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f0633f-eca4-4316-8e7e-6f1e4246802b" path="/var/lib/kubelet/pods/44f0633f-eca4-4316-8e7e-6f1e4246802b/volumes" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.792709 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:38:10 crc kubenswrapper[4895]: E0320 14:38:10.794241 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1211640-b38a-4c4f-883f-da719892cd75" containerName="oc" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.794261 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1211640-b38a-4c4f-883f-da719892cd75" containerName="oc" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.794592 4895 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e1211640-b38a-4c4f-883f-da719892cd75" containerName="oc" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.796729 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.804544 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.869637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.869742 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.869804 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdw4\" (UniqueName: \"kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.971982 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content\") pod \"redhat-operators-sxlpn\" (UID: 
\"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.972053 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.972121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdw4\" (UniqueName: \"kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.972600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.972652 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:10 crc kubenswrapper[4895]: I0320 14:38:10.993206 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdw4\" (UniqueName: \"kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4\") pod \"redhat-operators-sxlpn\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " 
pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:11 crc kubenswrapper[4895]: I0320 14:38:11.131693 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:11 crc kubenswrapper[4895]: I0320 14:38:11.931104 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:38:12 crc kubenswrapper[4895]: I0320 14:38:12.439808 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerStarted","Data":"85d319b4299bc3e34b454a9a5e91e2c40f07047ab212d98c610773314d506054"} Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.451709 4895 generic.go:334] "Generic (PLEG): container finished" podID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerID="857c004da40133afa87a0dc4830887cb2c7d515fd12cdc840e58bf24559a5039" exitCode=0 Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.451810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerDied","Data":"857c004da40133afa87a0dc4830887cb2c7d515fd12cdc840e58bf24559a5039"} Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.454510 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.791067 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.794370 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.820575 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.835927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7sm\" (UniqueName: \"kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.836051 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.836117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.938733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.938964 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mc7sm\" (UniqueName: \"kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.939036 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.939215 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:13 crc kubenswrapper[4895]: I0320 14:38:13.939490 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:14 crc kubenswrapper[4895]: I0320 14:38:14.332044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7sm\" (UniqueName: \"kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm\") pod \"certified-operators-qg9zb\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:14 crc kubenswrapper[4895]: I0320 14:38:14.411085 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:15 crc kubenswrapper[4895]: I0320 14:38:15.328081 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:15 crc kubenswrapper[4895]: I0320 14:38:15.500124 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerStarted","Data":"1bd04c10ff9b2770289f0136d7e5d85cff8f127f140cda026137c18d676f6b69"} Mar 20 14:38:15 crc kubenswrapper[4895]: I0320 14:38:15.510179 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerStarted","Data":"4d2d147bcb385ea510c6a9f8d41d45896a2071f7aeb49d5f4c9672fbfe5862b3"} Mar 20 14:38:16 crc kubenswrapper[4895]: I0320 14:38:16.521882 4895 generic.go:334] "Generic (PLEG): container finished" podID="b6055199-823b-4385-8211-d3d6d46402ce" containerID="a1910bc1bacc4fa65a0051cef8ad3acf7c3f5e5e525b819690d1527d7be23f1c" exitCode=0 Mar 20 14:38:16 crc kubenswrapper[4895]: I0320 14:38:16.522063 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerDied","Data":"a1910bc1bacc4fa65a0051cef8ad3acf7c3f5e5e525b819690d1527d7be23f1c"} Mar 20 14:38:18 crc kubenswrapper[4895]: I0320 14:38:18.553609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerStarted","Data":"e9810a4214243afdc7ccb43da6b7b59dfd5edb85a2834d4729df8a36a8dd2d3d"} Mar 20 14:38:21 crc kubenswrapper[4895]: I0320 14:38:21.584107 4895 generic.go:334] "Generic (PLEG): container finished" podID="b6055199-823b-4385-8211-d3d6d46402ce" 
containerID="e9810a4214243afdc7ccb43da6b7b59dfd5edb85a2834d4729df8a36a8dd2d3d" exitCode=0 Mar 20 14:38:21 crc kubenswrapper[4895]: I0320 14:38:21.584247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerDied","Data":"e9810a4214243afdc7ccb43da6b7b59dfd5edb85a2834d4729df8a36a8dd2d3d"} Mar 20 14:38:22 crc kubenswrapper[4895]: I0320 14:38:22.212162 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:38:22 crc kubenswrapper[4895]: E0320 14:38:22.212730 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:38:22 crc kubenswrapper[4895]: I0320 14:38:22.601383 4895 generic.go:334] "Generic (PLEG): container finished" podID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerID="1bd04c10ff9b2770289f0136d7e5d85cff8f127f140cda026137c18d676f6b69" exitCode=0 Mar 20 14:38:22 crc kubenswrapper[4895]: I0320 14:38:22.601432 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerDied","Data":"1bd04c10ff9b2770289f0136d7e5d85cff8f127f140cda026137c18d676f6b69"} Mar 20 14:38:22 crc kubenswrapper[4895]: I0320 14:38:22.613941 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerStarted","Data":"240db7a25820f5a8c8387dfecdd28ce9e8bcf562142f297ad1ac761d515aca96"} Mar 20 14:38:22 
crc kubenswrapper[4895]: I0320 14:38:22.650250 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qg9zb" podStartSLOduration=4.135917921 podStartE2EDuration="9.650219047s" podCreationTimestamp="2026-03-20 14:38:13 +0000 UTC" firstStartedPulling="2026-03-20 14:38:16.524660582 +0000 UTC m=+4596.034379548" lastFinishedPulling="2026-03-20 14:38:22.038961708 +0000 UTC m=+4601.548680674" observedRunningTime="2026-03-20 14:38:22.648609128 +0000 UTC m=+4602.158328094" watchObservedRunningTime="2026-03-20 14:38:22.650219047 +0000 UTC m=+4602.159938023" Mar 20 14:38:23 crc kubenswrapper[4895]: I0320 14:38:23.145844 4895 scope.go:117] "RemoveContainer" containerID="da0a9ae87bb8f0acf92922eefb6c1da8d67493bb8b21a24304b95740278a2d09" Mar 20 14:38:23 crc kubenswrapper[4895]: I0320 14:38:23.646274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerStarted","Data":"6f671712f0443e62c92a6f0b23418f98de61694fb04b9d890013b52fdfd2ab26"} Mar 20 14:38:23 crc kubenswrapper[4895]: I0320 14:38:23.680084 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sxlpn" podStartSLOduration=4.009263182 podStartE2EDuration="13.680065808s" podCreationTimestamp="2026-03-20 14:38:10 +0000 UTC" firstStartedPulling="2026-03-20 14:38:13.454022311 +0000 UTC m=+4592.963741277" lastFinishedPulling="2026-03-20 14:38:23.124824947 +0000 UTC m=+4602.634543903" observedRunningTime="2026-03-20 14:38:23.676784938 +0000 UTC m=+4603.186503924" watchObservedRunningTime="2026-03-20 14:38:23.680065808 +0000 UTC m=+4603.189784774" Mar 20 14:38:24 crc kubenswrapper[4895]: I0320 14:38:24.412660 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:24 crc kubenswrapper[4895]: I0320 
14:38:24.413298 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:25 crc kubenswrapper[4895]: I0320 14:38:25.471938 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qg9zb" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="registry-server" probeResult="failure" output=< Mar 20 14:38:25 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:38:25 crc kubenswrapper[4895]: > Mar 20 14:38:31 crc kubenswrapper[4895]: I0320 14:38:31.132785 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:31 crc kubenswrapper[4895]: I0320 14:38:31.133409 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:38:32 crc kubenswrapper[4895]: I0320 14:38:32.193697 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sxlpn" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" probeResult="failure" output=< Mar 20 14:38:32 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:38:32 crc kubenswrapper[4895]: > Mar 20 14:38:33 crc kubenswrapper[4895]: I0320 14:38:33.211862 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:38:33 crc kubenswrapper[4895]: E0320 14:38:33.212513 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" 
podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:38:34 crc kubenswrapper[4895]: I0320 14:38:34.470611 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:34 crc kubenswrapper[4895]: I0320 14:38:34.540860 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:34 crc kubenswrapper[4895]: I0320 14:38:34.718104 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:35 crc kubenswrapper[4895]: I0320 14:38:35.780484 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qg9zb" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="registry-server" containerID="cri-o://240db7a25820f5a8c8387dfecdd28ce9e8bcf562142f297ad1ac761d515aca96" gracePeriod=2 Mar 20 14:38:36 crc kubenswrapper[4895]: I0320 14:38:36.793667 4895 generic.go:334] "Generic (PLEG): container finished" podID="b6055199-823b-4385-8211-d3d6d46402ce" containerID="240db7a25820f5a8c8387dfecdd28ce9e8bcf562142f297ad1ac761d515aca96" exitCode=0 Mar 20 14:38:36 crc kubenswrapper[4895]: I0320 14:38:36.793755 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerDied","Data":"240db7a25820f5a8c8387dfecdd28ce9e8bcf562142f297ad1ac761d515aca96"} Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.151162 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.284626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities\") pod \"b6055199-823b-4385-8211-d3d6d46402ce\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.284672 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content\") pod \"b6055199-823b-4385-8211-d3d6d46402ce\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.284731 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc7sm\" (UniqueName: \"kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm\") pod \"b6055199-823b-4385-8211-d3d6d46402ce\" (UID: \"b6055199-823b-4385-8211-d3d6d46402ce\") " Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.287421 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities" (OuterVolumeSpecName: "utilities") pod "b6055199-823b-4385-8211-d3d6d46402ce" (UID: "b6055199-823b-4385-8211-d3d6d46402ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.320708 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm" (OuterVolumeSpecName: "kube-api-access-mc7sm") pod "b6055199-823b-4385-8211-d3d6d46402ce" (UID: "b6055199-823b-4385-8211-d3d6d46402ce"). InnerVolumeSpecName "kube-api-access-mc7sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.373582 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6055199-823b-4385-8211-d3d6d46402ce" (UID: "b6055199-823b-4385-8211-d3d6d46402ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.387526 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.387563 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6055199-823b-4385-8211-d3d6d46402ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.387577 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc7sm\" (UniqueName: \"kubernetes.io/projected/b6055199-823b-4385-8211-d3d6d46402ce-kube-api-access-mc7sm\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.807703 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qg9zb" event={"ID":"b6055199-823b-4385-8211-d3d6d46402ce","Type":"ContainerDied","Data":"4d2d147bcb385ea510c6a9f8d41d45896a2071f7aeb49d5f4c9672fbfe5862b3"} Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.807752 4895 scope.go:117] "RemoveContainer" containerID="240db7a25820f5a8c8387dfecdd28ce9e8bcf562142f297ad1ac761d515aca96" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.808769 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qg9zb" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.864652 4895 scope.go:117] "RemoveContainer" containerID="e9810a4214243afdc7ccb43da6b7b59dfd5edb85a2834d4729df8a36a8dd2d3d" Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.864775 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.884108 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qg9zb"] Mar 20 14:38:37 crc kubenswrapper[4895]: I0320 14:38:37.906959 4895 scope.go:117] "RemoveContainer" containerID="a1910bc1bacc4fa65a0051cef8ad3acf7c3f5e5e525b819690d1527d7be23f1c" Mar 20 14:38:39 crc kubenswrapper[4895]: I0320 14:38:39.240539 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6055199-823b-4385-8211-d3d6d46402ce" path="/var/lib/kubelet/pods/b6055199-823b-4385-8211-d3d6d46402ce/volumes" Mar 20 14:38:42 crc kubenswrapper[4895]: I0320 14:38:42.191935 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sxlpn" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" probeResult="failure" output=< Mar 20 14:38:42 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:38:42 crc kubenswrapper[4895]: > Mar 20 14:38:45 crc kubenswrapper[4895]: I0320 14:38:45.211906 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:38:45 crc kubenswrapper[4895]: E0320 14:38:45.213003 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:38:52 crc kubenswrapper[4895]: I0320 14:38:52.180900 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sxlpn" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" probeResult="failure" output=< Mar 20 14:38:52 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:38:52 crc kubenswrapper[4895]: > Mar 20 14:38:59 crc kubenswrapper[4895]: I0320 14:38:59.213977 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:38:59 crc kubenswrapper[4895]: E0320 14:38:59.214847 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:39:02 crc kubenswrapper[4895]: I0320 14:39:02.181418 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sxlpn" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" probeResult="failure" output=< Mar 20 14:39:02 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:39:02 crc kubenswrapper[4895]: > Mar 20 14:39:06 crc kubenswrapper[4895]: I0320 14:39:06.065316 4895 generic.go:334] "Generic (PLEG): container finished" podID="2acaa768-7497-437a-bd7d-46308eb5e0e2" containerID="f299eacb03a9c62183ecabdce2cd481ad7f336fdb29525f2140f08490f1d3adc" exitCode=0 Mar 20 14:39:06 
crc kubenswrapper[4895]: I0320 14:39:06.065753 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2acaa768-7497-437a-bd7d-46308eb5e0e2","Type":"ContainerDied","Data":"f299eacb03a9c62183ecabdce2cd481ad7f336fdb29525f2140f08490f1d3adc"} Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.213166 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.258605 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.258720 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.258867 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.258898 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggdhw\" (UniqueName: \"kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.259005 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.259053 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.259099 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.259139 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.259176 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret\") pod \"2acaa768-7497-437a-bd7d-46308eb5e0e2\" (UID: \"2acaa768-7497-437a-bd7d-46308eb5e0e2\") " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.263981 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data" (OuterVolumeSpecName: "config-data") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" 
(UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.268776 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.269378 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.290027 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw" (OuterVolumeSpecName: "kube-api-access-ggdhw") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "kube-api-access-ggdhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.315567 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.315661 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.320624 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364056 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364097 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggdhw\" (UniqueName: \"kubernetes.io/projected/2acaa768-7497-437a-bd7d-46308eb5e0e2-kube-api-access-ggdhw\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364111 4895 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364122 4895 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ca-certs\") on 
node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364132 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364140 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.364148 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.414985 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.451739 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.466102 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.466147 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2acaa768-7497-437a-bd7d-46308eb5e0e2-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.844193 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "2acaa768-7497-437a-bd7d-46308eb5e0e2" (UID: "2acaa768-7497-437a-bd7d-46308eb5e0e2"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:39:08 crc kubenswrapper[4895]: I0320 14:39:08.875652 4895 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2acaa768-7497-437a-bd7d-46308eb5e0e2-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:09 crc kubenswrapper[4895]: I0320 14:39:09.092998 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2acaa768-7497-437a-bd7d-46308eb5e0e2","Type":"ContainerDied","Data":"5e00699812b8504e6d30b649c2b310121c4dde9edf208f4677404408bf0276aa"} Mar 20 14:39:09 crc kubenswrapper[4895]: I0320 14:39:09.093295 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e00699812b8504e6d30b649c2b310121c4dde9edf208f4677404408bf0276aa" Mar 20 14:39:09 crc kubenswrapper[4895]: I0320 14:39:09.093053 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 14:39:11 crc kubenswrapper[4895]: I0320 14:39:11.873223 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:39:11 crc kubenswrapper[4895]: I0320 14:39:11.936442 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:39:12 crc kubenswrapper[4895]: I0320 14:39:12.109443 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:39:13 crc kubenswrapper[4895]: I0320 14:39:13.129290 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sxlpn" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" containerID="cri-o://6f671712f0443e62c92a6f0b23418f98de61694fb04b9d890013b52fdfd2ab26" gracePeriod=2 Mar 20 14:39:14 crc 
kubenswrapper[4895]: I0320 14:39:14.177884 4895 generic.go:334] "Generic (PLEG): container finished" podID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerID="6f671712f0443e62c92a6f0b23418f98de61694fb04b9d890013b52fdfd2ab26" exitCode=0 Mar 20 14:39:14 crc kubenswrapper[4895]: I0320 14:39:14.178218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerDied","Data":"6f671712f0443e62c92a6f0b23418f98de61694fb04b9d890013b52fdfd2ab26"} Mar 20 14:39:14 crc kubenswrapper[4895]: I0320 14:39:14.217080 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:39:14 crc kubenswrapper[4895]: E0320 14:39:14.217343 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:39:14 crc kubenswrapper[4895]: I0320 14:39:14.903355 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.026361 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkdw4\" (UniqueName: \"kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4\") pod \"e11d832a-8876-4611-b38d-c20a3478b0d0\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.026746 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities\") pod \"e11d832a-8876-4611-b38d-c20a3478b0d0\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.026818 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content\") pod \"e11d832a-8876-4611-b38d-c20a3478b0d0\" (UID: \"e11d832a-8876-4611-b38d-c20a3478b0d0\") " Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.027460 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities" (OuterVolumeSpecName: "utilities") pod "e11d832a-8876-4611-b38d-c20a3478b0d0" (UID: "e11d832a-8876-4611-b38d-c20a3478b0d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.032821 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4" (OuterVolumeSpecName: "kube-api-access-lkdw4") pod "e11d832a-8876-4611-b38d-c20a3478b0d0" (UID: "e11d832a-8876-4611-b38d-c20a3478b0d0"). InnerVolumeSpecName "kube-api-access-lkdw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.042547 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.042585 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkdw4\" (UniqueName: \"kubernetes.io/projected/e11d832a-8876-4611-b38d-c20a3478b0d0-kube-api-access-lkdw4\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.192854 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e11d832a-8876-4611-b38d-c20a3478b0d0" (UID: "e11d832a-8876-4611-b38d-c20a3478b0d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.193380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxlpn" event={"ID":"e11d832a-8876-4611-b38d-c20a3478b0d0","Type":"ContainerDied","Data":"85d319b4299bc3e34b454a9a5e91e2c40f07047ab212d98c610773314d506054"} Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.193458 4895 scope.go:117] "RemoveContainer" containerID="6f671712f0443e62c92a6f0b23418f98de61694fb04b9d890013b52fdfd2ab26" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.193636 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxlpn" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.229592 4895 scope.go:117] "RemoveContainer" containerID="1bd04c10ff9b2770289f0136d7e5d85cff8f127f140cda026137c18d676f6b69" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.246929 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11d832a-8876-4611-b38d-c20a3478b0d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.247355 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.262448 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sxlpn"] Mar 20 14:39:15 crc kubenswrapper[4895]: I0320 14:39:15.288067 4895 scope.go:117] "RemoveContainer" containerID="857c004da40133afa87a0dc4830887cb2c7d515fd12cdc840e58bf24559a5039" Mar 20 14:39:17 crc kubenswrapper[4895]: I0320 14:39:17.223873 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" path="/var/lib/kubelet/pods/e11d832a-8876-4611-b38d-c20a3478b0d0/volumes" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.492702 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493217 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="extract-utilities" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493233 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="extract-utilities" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493246 4895 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493252 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493266 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="extract-content" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493274 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="extract-content" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493300 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="extract-content" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493306 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="extract-content" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493319 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acaa768-7497-437a-bd7d-46308eb5e0e2" containerName="tempest-tests-tempest-tests-runner" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493325 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acaa768-7497-437a-bd7d-46308eb5e0e2" containerName="tempest-tests-tempest-tests-runner" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493340 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493346 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: E0320 14:39:18.493359 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="extract-utilities" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493366 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="extract-utilities" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493599 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6055199-823b-4385-8211-d3d6d46402ce" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493619 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11d832a-8876-4611-b38d-c20a3478b0d0" containerName="registry-server" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.493636 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acaa768-7497-437a-bd7d-46308eb5e0e2" containerName="tempest-tests-tempest-tests-runner" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.494379 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.501350 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fchkv" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.508996 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.614375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjj8f\" (UniqueName: \"kubernetes.io/projected/54ae286f-9132-4ea8-bdfa-3db69f52a13b-kube-api-access-xjj8f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.614531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.716771 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.717254 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.724437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjj8f\" (UniqueName: \"kubernetes.io/projected/54ae286f-9132-4ea8-bdfa-3db69f52a13b-kube-api-access-xjj8f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.751296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjj8f\" (UniqueName: \"kubernetes.io/projected/54ae286f-9132-4ea8-bdfa-3db69f52a13b-kube-api-access-xjj8f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.766642 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54ae286f-9132-4ea8-bdfa-3db69f52a13b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:18 crc kubenswrapper[4895]: I0320 14:39:18.814120 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 14:39:19 crc kubenswrapper[4895]: I0320 14:39:19.544584 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 14:39:20 crc kubenswrapper[4895]: I0320 14:39:20.257548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"54ae286f-9132-4ea8-bdfa-3db69f52a13b","Type":"ContainerStarted","Data":"1765b7a2fb9442a53a9b21a56c8f9c1f69bc872521bb7d6b1ca743b3c33c3213"} Mar 20 14:39:21 crc kubenswrapper[4895]: I0320 14:39:21.278017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"54ae286f-9132-4ea8-bdfa-3db69f52a13b","Type":"ContainerStarted","Data":"f7f55aec73fafba1fe17fb1e8c9269304fc899405d24d181b454b5432630a2b5"} Mar 20 14:39:21 crc kubenswrapper[4895]: I0320 14:39:21.304716 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.314970189 podStartE2EDuration="3.304695289s" podCreationTimestamp="2026-03-20 14:39:18 +0000 UTC" firstStartedPulling="2026-03-20 14:39:19.557518496 +0000 UTC m=+4659.067237462" lastFinishedPulling="2026-03-20 14:39:20.547243596 +0000 UTC m=+4660.056962562" observedRunningTime="2026-03-20 14:39:21.29241536 +0000 UTC m=+4660.802134336" watchObservedRunningTime="2026-03-20 14:39:21.304695289 +0000 UTC m=+4660.814414265" Mar 20 14:39:28 crc kubenswrapper[4895]: I0320 14:39:28.212037 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:39:28 crc kubenswrapper[4895]: E0320 14:39:28.212911 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:39:40 crc kubenswrapper[4895]: I0320 14:39:40.213846 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:39:40 crc kubenswrapper[4895]: E0320 14:39:40.215024 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:39:51 crc kubenswrapper[4895]: I0320 14:39:51.219672 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:39:51 crc kubenswrapper[4895]: E0320 14:39:51.220534 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.193473 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566960-4vtlx"] Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.196881 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.201367 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.201914 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.217321 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.235441 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-4vtlx"] Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.361615 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxmg\" (UniqueName: \"kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg\") pod \"auto-csr-approver-29566960-4vtlx\" (UID: \"d4f9ea03-8b03-43c1-bb16-c5a66933bd62\") " pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.463888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxmg\" (UniqueName: \"kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg\") pod \"auto-csr-approver-29566960-4vtlx\" (UID: \"d4f9ea03-8b03-43c1-bb16-c5a66933bd62\") " pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:00 crc kubenswrapper[4895]: I0320 14:40:00.932287 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxmg\" (UniqueName: \"kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg\") pod \"auto-csr-approver-29566960-4vtlx\" (UID: \"d4f9ea03-8b03-43c1-bb16-c5a66933bd62\") " 
pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:01 crc kubenswrapper[4895]: I0320 14:40:01.131240 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:01 crc kubenswrapper[4895]: I0320 14:40:01.914682 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-4vtlx"] Mar 20 14:40:02 crc kubenswrapper[4895]: I0320 14:40:02.649123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" event={"ID":"d4f9ea03-8b03-43c1-bb16-c5a66933bd62","Type":"ContainerStarted","Data":"7b7d37ececb49dff5e28402fcbef7406f62c4a90d931727d91818405fa1d5e4c"} Mar 20 14:40:04 crc kubenswrapper[4895]: I0320 14:40:04.672203 4895 generic.go:334] "Generic (PLEG): container finished" podID="d4f9ea03-8b03-43c1-bb16-c5a66933bd62" containerID="924a5dbbde731e0cc136aaf1e53e308aae193be5ce3d0ebf268897457d14ab89" exitCode=0 Mar 20 14:40:04 crc kubenswrapper[4895]: I0320 14:40:04.672768 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" event={"ID":"d4f9ea03-8b03-43c1-bb16-c5a66933bd62","Type":"ContainerDied","Data":"924a5dbbde731e0cc136aaf1e53e308aae193be5ce3d0ebf268897457d14ab89"} Mar 20 14:40:06 crc kubenswrapper[4895]: I0320 14:40:06.212598 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07" Mar 20 14:40:06 crc kubenswrapper[4895]: I0320 14:40:06.691934 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24"} Mar 20 14:40:06 crc kubenswrapper[4895]: I0320 14:40:06.924258 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.034329 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfxmg\" (UniqueName: \"kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg\") pod \"d4f9ea03-8b03-43c1-bb16-c5a66933bd62\" (UID: \"d4f9ea03-8b03-43c1-bb16-c5a66933bd62\") " Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.041900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg" (OuterVolumeSpecName: "kube-api-access-qfxmg") pod "d4f9ea03-8b03-43c1-bb16-c5a66933bd62" (UID: "d4f9ea03-8b03-43c1-bb16-c5a66933bd62"). InnerVolumeSpecName "kube-api-access-qfxmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.136608 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfxmg\" (UniqueName: \"kubernetes.io/projected/d4f9ea03-8b03-43c1-bb16-c5a66933bd62-kube-api-access-qfxmg\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.701484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" event={"ID":"d4f9ea03-8b03-43c1-bb16-c5a66933bd62","Type":"ContainerDied","Data":"7b7d37ececb49dff5e28402fcbef7406f62c4a90d931727d91818405fa1d5e4c"} Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.701769 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7d37ececb49dff5e28402fcbef7406f62c4a90d931727d91818405fa1d5e4c" Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.701590 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-4vtlx" Mar 20 14:40:07 crc kubenswrapper[4895]: I0320 14:40:07.996193 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-m495f"] Mar 20 14:40:08 crc kubenswrapper[4895]: I0320 14:40:08.009490 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-m495f"] Mar 20 14:40:09 crc kubenswrapper[4895]: I0320 14:40:09.236839 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1027d17d-85ca-404c-9af3-2cef5d0b65b3" path="/var/lib/kubelet/pods/1027d17d-85ca-404c-9af3-2cef5d0b65b3/volumes" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.710465 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rrdnn/must-gather-ssw99"] Mar 20 14:40:14 crc kubenswrapper[4895]: E0320 14:40:14.711578 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f9ea03-8b03-43c1-bb16-c5a66933bd62" containerName="oc" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.711599 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f9ea03-8b03-43c1-bb16-c5a66933bd62" containerName="oc" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.711882 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f9ea03-8b03-43c1-bb16-c5a66933bd62" containerName="oc" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.713362 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.716062 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rrdnn"/"default-dockercfg-2zhct" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.716236 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rrdnn"/"kube-root-ca.crt" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.717640 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rrdnn"/"openshift-service-ca.crt" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.726056 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rrdnn/must-gather-ssw99"] Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.794870 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whq47\" (UniqueName: \"kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.794986 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.896681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " 
pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.896855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whq47\" (UniqueName: \"kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.897509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:14 crc kubenswrapper[4895]: I0320 14:40:14.920501 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whq47\" (UniqueName: \"kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47\") pod \"must-gather-ssw99\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:15 crc kubenswrapper[4895]: I0320 14:40:15.036304 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:40:15 crc kubenswrapper[4895]: I0320 14:40:15.824179 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rrdnn/must-gather-ssw99"] Mar 20 14:40:16 crc kubenswrapper[4895]: I0320 14:40:16.792915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/must-gather-ssw99" event={"ID":"f685eaf5-ecfb-4102-8d32-f200e5346700","Type":"ContainerStarted","Data":"95e07ff4d05f0c81e3f9a00155766899fdaefa3b68e24bf6fee9c867a35ca6b7"} Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.316040 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.320166 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.337446 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.455868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmg5v\" (UniqueName: \"kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.455945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 
14:40:22.456296 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.558457 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.558608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmg5v\" (UniqueName: \"kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.558673 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.559104 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.559180 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.605271 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmg5v\" (UniqueName: \"kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v\") pod \"community-operators-42wp2\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:22 crc kubenswrapper[4895]: I0320 14:40:22.655036 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:23 crc kubenswrapper[4895]: I0320 14:40:23.376921 4895 scope.go:117] "RemoveContainer" containerID="52aa608550d40774792e103aa42508f75c91e224e3b2a9cc09a79528c50ed27b" Mar 20 14:40:27 crc kubenswrapper[4895]: I0320 14:40:27.910947 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/must-gather-ssw99" event={"ID":"f685eaf5-ecfb-4102-8d32-f200e5346700","Type":"ContainerStarted","Data":"5dab94cf46dc749b640bbf5d6a22fc44f58db65f0132f6b03fd3e95b324b704f"} Mar 20 14:40:27 crc kubenswrapper[4895]: I0320 14:40:27.985283 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:28 crc kubenswrapper[4895]: I0320 14:40:28.921519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/must-gather-ssw99" event={"ID":"f685eaf5-ecfb-4102-8d32-f200e5346700","Type":"ContainerStarted","Data":"5d7114f836d125b623b896e1877cf8d7e3c2ef38476b7e81e05e9d8a2001155a"} Mar 20 14:40:28 crc kubenswrapper[4895]: I0320 14:40:28.923610 4895 generic.go:334] 
"Generic (PLEG): container finished" podID="f6cc8520-40f7-4599-922e-43c014adc88c" containerID="d27ab5b57893040bf14c7c5de8d0b9804afdad59378a18bc9d823c794fde79c4" exitCode=0 Mar 20 14:40:28 crc kubenswrapper[4895]: I0320 14:40:28.923681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerDied","Data":"d27ab5b57893040bf14c7c5de8d0b9804afdad59378a18bc9d823c794fde79c4"} Mar 20 14:40:28 crc kubenswrapper[4895]: I0320 14:40:28.923737 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerStarted","Data":"63f38632efe224a60ae5c0a77227c3673ae8652dddf39b3bf756a8d84449cc03"} Mar 20 14:40:28 crc kubenswrapper[4895]: I0320 14:40:28.958843 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rrdnn/must-gather-ssw99" podStartSLOduration=3.615708616 podStartE2EDuration="14.958821714s" podCreationTimestamp="2026-03-20 14:40:14 +0000 UTC" firstStartedPulling="2026-03-20 14:40:15.805530493 +0000 UTC m=+4715.315249459" lastFinishedPulling="2026-03-20 14:40:27.148643591 +0000 UTC m=+4726.658362557" observedRunningTime="2026-03-20 14:40:28.952836148 +0000 UTC m=+4728.462555104" watchObservedRunningTime="2026-03-20 14:40:28.958821714 +0000 UTC m=+4728.468540690" Mar 20 14:40:30 crc kubenswrapper[4895]: I0320 14:40:30.942413 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerStarted","Data":"2055c8f26ae8f2aadc9c7f8ccdabfde23a1a3296e6074e0e72f032eaad4a3306"} Mar 20 14:40:31 crc kubenswrapper[4895]: I0320 14:40:31.952206 4895 generic.go:334] "Generic (PLEG): container finished" podID="f6cc8520-40f7-4599-922e-43c014adc88c" 
containerID="2055c8f26ae8f2aadc9c7f8ccdabfde23a1a3296e6074e0e72f032eaad4a3306" exitCode=0 Mar 20 14:40:31 crc kubenswrapper[4895]: I0320 14:40:31.952289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerDied","Data":"2055c8f26ae8f2aadc9c7f8ccdabfde23a1a3296e6074e0e72f032eaad4a3306"} Mar 20 14:40:32 crc kubenswrapper[4895]: I0320 14:40:32.968933 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerStarted","Data":"a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b"} Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.260988 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42wp2" podStartSLOduration=8.792346146 podStartE2EDuration="12.260970654s" podCreationTimestamp="2026-03-20 14:40:22 +0000 UTC" firstStartedPulling="2026-03-20 14:40:28.925744015 +0000 UTC m=+4728.435462981" lastFinishedPulling="2026-03-20 14:40:32.394368513 +0000 UTC m=+4731.904087489" observedRunningTime="2026-03-20 14:40:33.00476233 +0000 UTC m=+4732.514481306" watchObservedRunningTime="2026-03-20 14:40:34.260970654 +0000 UTC m=+4733.770689620" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.266832 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-6xwv5"] Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.268257 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.369564 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn986\" (UniqueName: \"kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.370053 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.471992 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.472127 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn986\" (UniqueName: \"kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.472657 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc 
kubenswrapper[4895]: I0320 14:40:34.499184 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn986\" (UniqueName: \"kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986\") pod \"crc-debug-6xwv5\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.586144 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:40:34 crc kubenswrapper[4895]: W0320 14:40:34.637552 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0308bdf2_0a20_4a43_a1cf_58b570393bc1.slice/crio-6e4c2832c74cf83900c674c596b1fa54d6f7880470b1d5cfb66ed293f55879c0 WatchSource:0}: Error finding container 6e4c2832c74cf83900c674c596b1fa54d6f7880470b1d5cfb66ed293f55879c0: Status 404 returned error can't find the container with id 6e4c2832c74cf83900c674c596b1fa54d6f7880470b1d5cfb66ed293f55879c0 Mar 20 14:40:34 crc kubenswrapper[4895]: I0320 14:40:34.987352 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" event={"ID":"0308bdf2-0a20-4a43-a1cf-58b570393bc1","Type":"ContainerStarted","Data":"6e4c2832c74cf83900c674c596b1fa54d6f7880470b1d5cfb66ed293f55879c0"} Mar 20 14:40:42 crc kubenswrapper[4895]: I0320 14:40:42.655546 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:42 crc kubenswrapper[4895]: I0320 14:40:42.656147 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:42 crc kubenswrapper[4895]: I0320 14:40:42.754292 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42wp2" Mar 20 
14:40:43 crc kubenswrapper[4895]: I0320 14:40:43.183913 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:43 crc kubenswrapper[4895]: I0320 14:40:43.248346 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:45 crc kubenswrapper[4895]: I0320 14:40:45.138193 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42wp2" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="registry-server" containerID="cri-o://a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" gracePeriod=2 Mar 20 14:40:46 crc kubenswrapper[4895]: I0320 14:40:46.162842 4895 generic.go:334] "Generic (PLEG): container finished" podID="f6cc8520-40f7-4599-922e-43c014adc88c" containerID="a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" exitCode=0 Mar 20 14:40:46 crc kubenswrapper[4895]: I0320 14:40:46.162929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerDied","Data":"a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b"} Mar 20 14:40:52 crc kubenswrapper[4895]: E0320 14:40:52.656554 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b is running failed: container process not found" containerID="a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:40:52 crc kubenswrapper[4895]: E0320 14:40:52.657513 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b is running failed: container process not found" containerID="a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:40:52 crc kubenswrapper[4895]: E0320 14:40:52.657820 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b is running failed: container process not found" containerID="a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:40:52 crc kubenswrapper[4895]: E0320 14:40:52.657888 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-42wp2" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="registry-server" Mar 20 14:40:56 crc kubenswrapper[4895]: E0320 14:40:56.336294 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 20 14:40:56 crc kubenswrapper[4895]: E0320 14:40:56.337225 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json 
registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn986,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-6xwv5_openshift-must-gather-rrdnn(0308bdf2-0a20-4a43-a1cf-58b570393bc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 14:40:56 crc kubenswrapper[4895]: E0320 14:40:56.338502 4895 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" Mar 20 14:40:57 crc kubenswrapper[4895]: E0320 14:40:57.320537 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.624404 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.731117 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities\") pod \"f6cc8520-40f7-4599-922e-43c014adc88c\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.731220 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content\") pod \"f6cc8520-40f7-4599-922e-43c014adc88c\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.731324 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmg5v\" (UniqueName: \"kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v\") pod \"f6cc8520-40f7-4599-922e-43c014adc88c\" (UID: \"f6cc8520-40f7-4599-922e-43c014adc88c\") " Mar 20 
14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.754685 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities" (OuterVolumeSpecName: "utilities") pod "f6cc8520-40f7-4599-922e-43c014adc88c" (UID: "f6cc8520-40f7-4599-922e-43c014adc88c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.798023 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v" (OuterVolumeSpecName: "kube-api-access-zmg5v") pod "f6cc8520-40f7-4599-922e-43c014adc88c" (UID: "f6cc8520-40f7-4599-922e-43c014adc88c"). InnerVolumeSpecName "kube-api-access-zmg5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.835058 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.835099 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmg5v\" (UniqueName: \"kubernetes.io/projected/f6cc8520-40f7-4599-922e-43c014adc88c-kube-api-access-zmg5v\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.835645 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6cc8520-40f7-4599-922e-43c014adc88c" (UID: "f6cc8520-40f7-4599-922e-43c014adc88c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:40:57 crc kubenswrapper[4895]: I0320 14:40:57.937076 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cc8520-40f7-4599-922e-43c014adc88c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.327005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42wp2" event={"ID":"f6cc8520-40f7-4599-922e-43c014adc88c","Type":"ContainerDied","Data":"63f38632efe224a60ae5c0a77227c3673ae8652dddf39b3bf756a8d84449cc03"} Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.327067 4895 scope.go:117] "RemoveContainer" containerID="a443d9139815cc97b2db90d413515c71c9c400ed4d30b723e365cd2434d9167b" Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.327093 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42wp2" Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.352076 4895 scope.go:117] "RemoveContainer" containerID="2055c8f26ae8f2aadc9c7f8ccdabfde23a1a3296e6074e0e72f032eaad4a3306" Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.376699 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.381322 4895 scope.go:117] "RemoveContainer" containerID="d27ab5b57893040bf14c7c5de8d0b9804afdad59378a18bc9d823c794fde79c4" Mar 20 14:40:58 crc kubenswrapper[4895]: I0320 14:40:58.385766 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42wp2"] Mar 20 14:40:59 crc kubenswrapper[4895]: I0320 14:40:59.241588 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" path="/var/lib/kubelet/pods/f6cc8520-40f7-4599-922e-43c014adc88c/volumes" Mar 20 14:41:11 crc 
kubenswrapper[4895]: I0320 14:41:11.468998 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" event={"ID":"0308bdf2-0a20-4a43-a1cf-58b570393bc1","Type":"ContainerStarted","Data":"0134536c4bd4061074fa5b4fc912a61aff201d2779aafa83672f34ef96f14c18"} Mar 20 14:41:11 crc kubenswrapper[4895]: I0320 14:41:11.495568 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" podStartSLOduration=1.5007028340000002 podStartE2EDuration="37.49555057s" podCreationTimestamp="2026-03-20 14:40:34 +0000 UTC" firstStartedPulling="2026-03-20 14:40:34.639784052 +0000 UTC m=+4734.149503018" lastFinishedPulling="2026-03-20 14:41:10.634631788 +0000 UTC m=+4770.144350754" observedRunningTime="2026-03-20 14:41:11.489658836 +0000 UTC m=+4770.999377802" watchObservedRunningTime="2026-03-20 14:41:11.49555057 +0000 UTC m=+4771.005269536" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.576170 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:41:46 crc kubenswrapper[4895]: E0320 14:41:46.577793 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="extract-utilities" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.577814 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="extract-utilities" Mar 20 14:41:46 crc kubenswrapper[4895]: E0320 14:41:46.577841 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="registry-server" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.577849 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="registry-server" Mar 20 14:41:46 crc kubenswrapper[4895]: E0320 14:41:46.577862 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="extract-content" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.577869 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="extract-content" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.578140 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cc8520-40f7-4599-922e-43c014adc88c" containerName="registry-server" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.579966 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.666408 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.753645 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t45nv\" (UniqueName: \"kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.753707 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.754149 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities\") pod 
\"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.855619 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.855725 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t45nv\" (UniqueName: \"kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.855784 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.856405 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.856643 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities\") pod \"redhat-marketplace-4k6wx\" (UID: 
\"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.889461 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t45nv\" (UniqueName: \"kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv\") pod \"redhat-marketplace-4k6wx\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:46 crc kubenswrapper[4895]: I0320 14:41:46.899822 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:47 crc kubenswrapper[4895]: I0320 14:41:47.788002 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:41:47 crc kubenswrapper[4895]: I0320 14:41:47.907839 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerStarted","Data":"d5b898fab3deed33ac0652ad930c3e049d749d4f20a330ce5835c5a2ea6e380b"} Mar 20 14:41:48 crc kubenswrapper[4895]: I0320 14:41:48.917509 4895 generic.go:334] "Generic (PLEG): container finished" podID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerID="d053b149db569aa6397b491d8926ab86f450c0aaf307b4f666a2c63fa82ed285" exitCode=0 Mar 20 14:41:48 crc kubenswrapper[4895]: I0320 14:41:48.917597 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerDied","Data":"d053b149db569aa6397b491d8926ab86f450c0aaf307b4f666a2c63fa82ed285"} Mar 20 14:41:49 crc kubenswrapper[4895]: I0320 14:41:49.928272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" 
event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerStarted","Data":"89458631a3f7218cb325afe708ebb3f0abb4c9296f9d75ad42e205e0d34f51f7"} Mar 20 14:41:51 crc kubenswrapper[4895]: I0320 14:41:51.948950 4895 generic.go:334] "Generic (PLEG): container finished" podID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerID="89458631a3f7218cb325afe708ebb3f0abb4c9296f9d75ad42e205e0d34f51f7" exitCode=0 Mar 20 14:41:51 crc kubenswrapper[4895]: I0320 14:41:51.949039 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerDied","Data":"89458631a3f7218cb325afe708ebb3f0abb4c9296f9d75ad42e205e0d34f51f7"} Mar 20 14:41:53 crc kubenswrapper[4895]: I0320 14:41:53.973244 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerStarted","Data":"175ddf25784d43fc989034c536cfeb48b9d971d37691fc477c5f1247bc27e9f6"} Mar 20 14:41:53 crc kubenswrapper[4895]: I0320 14:41:53.997945 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4k6wx" podStartSLOduration=4.547899543 podStartE2EDuration="7.997922486s" podCreationTimestamp="2026-03-20 14:41:46 +0000 UTC" firstStartedPulling="2026-03-20 14:41:48.91934787 +0000 UTC m=+4808.429066836" lastFinishedPulling="2026-03-20 14:41:52.369370813 +0000 UTC m=+4811.879089779" observedRunningTime="2026-03-20 14:41:53.994205485 +0000 UTC m=+4813.503924451" watchObservedRunningTime="2026-03-20 14:41:53.997922486 +0000 UTC m=+4813.507641452" Mar 20 14:41:56 crc kubenswrapper[4895]: I0320 14:41:56.900377 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:56 crc kubenswrapper[4895]: I0320 14:41:56.900903 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:41:56 crc kubenswrapper[4895]: I0320 14:41:56.965381 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.156473 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566962-b5kzh"] Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.158759 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.161017 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.161200 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.161370 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.175586 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-b5kzh"] Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.246305 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nswfq\" (UniqueName: \"kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq\") pod \"auto-csr-approver-29566962-b5kzh\" (UID: \"c5f07f4f-3d46-49e1-87d9-177e65622278\") " pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.349002 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nswfq\" (UniqueName: 
\"kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq\") pod \"auto-csr-approver-29566962-b5kzh\" (UID: \"c5f07f4f-3d46-49e1-87d9-177e65622278\") " pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.374583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nswfq\" (UniqueName: \"kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq\") pod \"auto-csr-approver-29566962-b5kzh\" (UID: \"c5f07f4f-3d46-49e1-87d9-177e65622278\") " pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:00 crc kubenswrapper[4895]: I0320 14:42:00.475897 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:01 crc kubenswrapper[4895]: I0320 14:42:01.269769 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-b5kzh"] Mar 20 14:42:02 crc kubenswrapper[4895]: I0320 14:42:02.063216 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" event={"ID":"c5f07f4f-3d46-49e1-87d9-177e65622278","Type":"ContainerStarted","Data":"c61ace69b31ed518247aa0f9060b97152498b946bda2ffa18865ac87143b8978"} Mar 20 14:42:04 crc kubenswrapper[4895]: I0320 14:42:04.099242 4895 generic.go:334] "Generic (PLEG): container finished" podID="c5f07f4f-3d46-49e1-87d9-177e65622278" containerID="132e8c907f12cd9d7eef4c7cc1e385c11ff6101346ca010c9e1ac3172ebf1f2f" exitCode=0 Mar 20 14:42:04 crc kubenswrapper[4895]: I0320 14:42:04.099720 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" event={"ID":"c5f07f4f-3d46-49e1-87d9-177e65622278","Type":"ContainerDied","Data":"132e8c907f12cd9d7eef4c7cc1e385c11ff6101346ca010c9e1ac3172ebf1f2f"} Mar 20 14:42:06 crc kubenswrapper[4895]: I0320 14:42:06.400132 4895 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:06 crc kubenswrapper[4895]: I0320 14:42:06.506434 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nswfq\" (UniqueName: \"kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq\") pod \"c5f07f4f-3d46-49e1-87d9-177e65622278\" (UID: \"c5f07f4f-3d46-49e1-87d9-177e65622278\") " Mar 20 14:42:06 crc kubenswrapper[4895]: I0320 14:42:06.514638 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq" (OuterVolumeSpecName: "kube-api-access-nswfq") pod "c5f07f4f-3d46-49e1-87d9-177e65622278" (UID: "c5f07f4f-3d46-49e1-87d9-177e65622278"). InnerVolumeSpecName "kube-api-access-nswfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:06 crc kubenswrapper[4895]: I0320 14:42:06.609509 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nswfq\" (UniqueName: \"kubernetes.io/projected/c5f07f4f-3d46-49e1-87d9-177e65622278-kube-api-access-nswfq\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:06 crc kubenswrapper[4895]: I0320 14:42:06.979992 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.050757 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.130279 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" event={"ID":"c5f07f4f-3d46-49e1-87d9-177e65622278","Type":"ContainerDied","Data":"c61ace69b31ed518247aa0f9060b97152498b946bda2ffa18865ac87143b8978"} Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.130332 4895 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61ace69b31ed518247aa0f9060b97152498b946bda2ffa18865ac87143b8978" Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.130295 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-b5kzh" Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.130433 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4k6wx" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="registry-server" containerID="cri-o://175ddf25784d43fc989034c536cfeb48b9d971d37691fc477c5f1247bc27e9f6" gracePeriod=2 Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.492451 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-jvqtm"] Mar 20 14:42:07 crc kubenswrapper[4895]: I0320 14:42:07.502361 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-jvqtm"] Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.140673 4895 generic.go:334] "Generic (PLEG): container finished" podID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerID="175ddf25784d43fc989034c536cfeb48b9d971d37691fc477c5f1247bc27e9f6" exitCode=0 Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.140725 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerDied","Data":"175ddf25784d43fc989034c536cfeb48b9d971d37691fc477c5f1247bc27e9f6"} Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.352459 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.453370 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t45nv\" (UniqueName: \"kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv\") pod \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.453453 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content\") pod \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.453550 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities\") pod \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\" (UID: \"c473dbb3-fb53-42a4-b67e-92cbc732f3ca\") " Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.454798 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities" (OuterVolumeSpecName: "utilities") pod "c473dbb3-fb53-42a4-b67e-92cbc732f3ca" (UID: "c473dbb3-fb53-42a4-b67e-92cbc732f3ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.461647 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv" (OuterVolumeSpecName: "kube-api-access-t45nv") pod "c473dbb3-fb53-42a4-b67e-92cbc732f3ca" (UID: "c473dbb3-fb53-42a4-b67e-92cbc732f3ca"). InnerVolumeSpecName "kube-api-access-t45nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.499879 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c473dbb3-fb53-42a4-b67e-92cbc732f3ca" (UID: "c473dbb3-fb53-42a4-b67e-92cbc732f3ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.562240 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t45nv\" (UniqueName: \"kubernetes.io/projected/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-kube-api-access-t45nv\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.562283 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:08 crc kubenswrapper[4895]: I0320 14:42:08.562297 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c473dbb3-fb53-42a4-b67e-92cbc732f3ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.151535 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4k6wx" event={"ID":"c473dbb3-fb53-42a4-b67e-92cbc732f3ca","Type":"ContainerDied","Data":"d5b898fab3deed33ac0652ad930c3e049d749d4f20a330ce5835c5a2ea6e380b"} Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.151592 4895 scope.go:117] "RemoveContainer" containerID="175ddf25784d43fc989034c536cfeb48b9d971d37691fc477c5f1247bc27e9f6" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.151622 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4k6wx" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.179090 4895 scope.go:117] "RemoveContainer" containerID="89458631a3f7218cb325afe708ebb3f0abb4c9296f9d75ad42e205e0d34f51f7" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.189842 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.200281 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4k6wx"] Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.236981 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7ffe79-c789-40c3-9222-26170907245b" path="/var/lib/kubelet/pods/0e7ffe79-c789-40c3-9222-26170907245b/volumes" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.237796 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" path="/var/lib/kubelet/pods/c473dbb3-fb53-42a4-b67e-92cbc732f3ca/volumes" Mar 20 14:42:09 crc kubenswrapper[4895]: I0320 14:42:09.563482 4895 scope.go:117] "RemoveContainer" containerID="d053b149db569aa6397b491d8926ab86f450c0aaf307b4f666a2c63fa82ed285" Mar 20 14:42:22 crc kubenswrapper[4895]: I0320 14:42:22.297172 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:42:22 crc kubenswrapper[4895]: I0320 14:42:22.297853 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 20 14:42:24 crc kubenswrapper[4895]: I0320 14:42:24.301593 4895 generic.go:334] "Generic (PLEG): container finished" podID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" containerID="0134536c4bd4061074fa5b4fc912a61aff201d2779aafa83672f34ef96f14c18" exitCode=0 Mar 20 14:42:24 crc kubenswrapper[4895]: I0320 14:42:24.301692 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" event={"ID":"0308bdf2-0a20-4a43-a1cf-58b570393bc1","Type":"ContainerDied","Data":"0134536c4bd4061074fa5b4fc912a61aff201d2779aafa83672f34ef96f14c18"} Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.441357 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.482835 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-6xwv5"] Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.493119 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-6xwv5"] Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.507791 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host\") pod \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.507874 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host" (OuterVolumeSpecName: "host") pod "0308bdf2-0a20-4a43-a1cf-58b570393bc1" (UID: "0308bdf2-0a20-4a43-a1cf-58b570393bc1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.508060 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn986\" (UniqueName: \"kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986\") pod \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\" (UID: \"0308bdf2-0a20-4a43-a1cf-58b570393bc1\") " Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.508498 4895 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0308bdf2-0a20-4a43-a1cf-58b570393bc1-host\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.512966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986" (OuterVolumeSpecName: "kube-api-access-vn986") pod "0308bdf2-0a20-4a43-a1cf-58b570393bc1" (UID: "0308bdf2-0a20-4a43-a1cf-58b570393bc1"). InnerVolumeSpecName "kube-api-access-vn986". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:25 crc kubenswrapper[4895]: I0320 14:42:25.610577 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn986\" (UniqueName: \"kubernetes.io/projected/0308bdf2-0a20-4a43-a1cf-58b570393bc1-kube-api-access-vn986\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:26 crc kubenswrapper[4895]: I0320 14:42:26.332360 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4c2832c74cf83900c674c596b1fa54d6f7880470b1d5cfb66ed293f55879c0" Mar 20 14:42:26 crc kubenswrapper[4895]: I0320 14:42:26.332444 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-6xwv5" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007168 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-wtrrn"] Mar 20 14:42:27 crc kubenswrapper[4895]: E0320 14:42:27.007588 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f07f4f-3d46-49e1-87d9-177e65622278" containerName="oc" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007599 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f07f4f-3d46-49e1-87d9-177e65622278" containerName="oc" Mar 20 14:42:27 crc kubenswrapper[4895]: E0320 14:42:27.007622 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="extract-utilities" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007628 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="extract-utilities" Mar 20 14:42:27 crc kubenswrapper[4895]: E0320 14:42:27.007640 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="extract-content" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007648 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="extract-content" Mar 20 14:42:27 crc kubenswrapper[4895]: E0320 14:42:27.007659 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="registry-server" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007664 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="registry-server" Mar 20 14:42:27 crc kubenswrapper[4895]: E0320 14:42:27.007680 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" 
containerName="container-00" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007685 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" containerName="container-00" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007905 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c473dbb3-fb53-42a4-b67e-92cbc732f3ca" containerName="registry-server" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007923 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" containerName="container-00" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.007939 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f07f4f-3d46-49e1-87d9-177e65622278" containerName="oc" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.008623 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.138231 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkf2x\" (UniqueName: \"kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.138572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.232569 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0308bdf2-0a20-4a43-a1cf-58b570393bc1" 
path="/var/lib/kubelet/pods/0308bdf2-0a20-4a43-a1cf-58b570393bc1/volumes" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.243900 4895 scope.go:117] "RemoveContainer" containerID="67703a66ac7a148ac9d0c3cb078b4ee94df93ff47c8ee82094ae197ba61e8693" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.254936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkf2x\" (UniqueName: \"kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.255044 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.256564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.311363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkf2x\" (UniqueName: \"kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x\") pod \"crc-debug-wtrrn\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:27 crc kubenswrapper[4895]: I0320 14:42:27.334563 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:28 crc kubenswrapper[4895]: I0320 14:42:28.372882 4895 generic.go:334] "Generic (PLEG): container finished" podID="5f8fd3c3-4e0c-4818-86ac-e3a77d087767" containerID="ac269453e156e5d0eceff789b3f8b9cfec1675b772fe2543013a9ef7672cb7ad" exitCode=0 Mar 20 14:42:28 crc kubenswrapper[4895]: I0320 14:42:28.372962 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" event={"ID":"5f8fd3c3-4e0c-4818-86ac-e3a77d087767","Type":"ContainerDied","Data":"ac269453e156e5d0eceff789b3f8b9cfec1675b772fe2543013a9ef7672cb7ad"} Mar 20 14:42:28 crc kubenswrapper[4895]: I0320 14:42:28.373155 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" event={"ID":"5f8fd3c3-4e0c-4818-86ac-e3a77d087767","Type":"ContainerStarted","Data":"d97967c0295b6623ec154dea3ae503cf46c1a236bd597465ffb657a5e49795e2"} Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.516319 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.610187 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-wtrrn"] Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.614556 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkf2x\" (UniqueName: \"kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x\") pod \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.614687 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host\") pod \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\" (UID: \"5f8fd3c3-4e0c-4818-86ac-e3a77d087767\") " Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.615404 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host" (OuterVolumeSpecName: "host") pod "5f8fd3c3-4e0c-4818-86ac-e3a77d087767" (UID: "5f8fd3c3-4e0c-4818-86ac-e3a77d087767"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.622519 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x" (OuterVolumeSpecName: "kube-api-access-kkf2x") pod "5f8fd3c3-4e0c-4818-86ac-e3a77d087767" (UID: "5f8fd3c3-4e0c-4818-86ac-e3a77d087767"). InnerVolumeSpecName "kube-api-access-kkf2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.626347 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-wtrrn"] Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.717998 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkf2x\" (UniqueName: \"kubernetes.io/projected/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-kube-api-access-kkf2x\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:29 crc kubenswrapper[4895]: I0320 14:42:29.718037 4895 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f8fd3c3-4e0c-4818-86ac-e3a77d087767-host\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.418802 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97967c0295b6623ec154dea3ae503cf46c1a236bd597465ffb657a5e49795e2" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.419045 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-wtrrn" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.979829 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-7vnzk"] Mar 20 14:42:30 crc kubenswrapper[4895]: E0320 14:42:30.980246 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8fd3c3-4e0c-4818-86ac-e3a77d087767" containerName="container-00" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.980258 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8fd3c3-4e0c-4818-86ac-e3a77d087767" containerName="container-00" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.980468 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8fd3c3-4e0c-4818-86ac-e3a77d087767" containerName="container-00" Mar 20 14:42:30 crc kubenswrapper[4895]: I0320 14:42:30.981410 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.043532 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ctmn\" (UniqueName: \"kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.043927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.145845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ctmn\" (UniqueName: 
\"kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.146052 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.146253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.164540 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ctmn\" (UniqueName: \"kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn\") pod \"crc-debug-7vnzk\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") " pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.266918 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8fd3c3-4e0c-4818-86ac-e3a77d087767" path="/var/lib/kubelet/pods/5f8fd3c3-4e0c-4818-86ac-e3a77d087767/volumes" Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.302049 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" Mar 20 14:42:31 crc kubenswrapper[4895]: W0320 14:42:31.404407 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99e5761d_cb35_49cf_b0c7_c7faf41afa87.slice/crio-79e3b0a1dd0a1efffba5bb81112d675818bd4fac19467c233636221c6d25e6d1 WatchSource:0}: Error finding container 79e3b0a1dd0a1efffba5bb81112d675818bd4fac19467c233636221c6d25e6d1: Status 404 returned error can't find the container with id 79e3b0a1dd0a1efffba5bb81112d675818bd4fac19467c233636221c6d25e6d1 Mar 20 14:42:31 crc kubenswrapper[4895]: I0320 14:42:31.436113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" event={"ID":"99e5761d-cb35-49cf-b0c7-c7faf41afa87","Type":"ContainerStarted","Data":"79e3b0a1dd0a1efffba5bb81112d675818bd4fac19467c233636221c6d25e6d1"} Mar 20 14:42:32 crc kubenswrapper[4895]: I0320 14:42:32.447946 4895 generic.go:334] "Generic (PLEG): container finished" podID="99e5761d-cb35-49cf-b0c7-c7faf41afa87" containerID="da011d62dcc72a6223f1797a89eeb2a85ea5c11b6db0d4a299e7e0aac4ea48f7" exitCode=0 Mar 20 14:42:32 crc kubenswrapper[4895]: I0320 14:42:32.448022 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk" event={"ID":"99e5761d-cb35-49cf-b0c7-c7faf41afa87","Type":"ContainerDied","Data":"da011d62dcc72a6223f1797a89eeb2a85ea5c11b6db0d4a299e7e0aac4ea48f7"} Mar 20 14:42:32 crc kubenswrapper[4895]: I0320 14:42:32.494920 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-7vnzk"] Mar 20 14:42:32 crc kubenswrapper[4895]: I0320 14:42:32.516343 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rrdnn/crc-debug-7vnzk"] Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.587169 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk"
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.695673 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host\") pod \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") "
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.695933 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ctmn\" (UniqueName: \"kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn\") pod \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\" (UID: \"99e5761d-cb35-49cf-b0c7-c7faf41afa87\") "
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.695832 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host" (OuterVolumeSpecName: "host") pod "99e5761d-cb35-49cf-b0c7-c7faf41afa87" (UID: "99e5761d-cb35-49cf-b0c7-c7faf41afa87"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.705873 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn" (OuterVolumeSpecName: "kube-api-access-8ctmn") pod "99e5761d-cb35-49cf-b0c7-c7faf41afa87" (UID: "99e5761d-cb35-49cf-b0c7-c7faf41afa87"). InnerVolumeSpecName "kube-api-access-8ctmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.798658 4895 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e5761d-cb35-49cf-b0c7-c7faf41afa87-host\") on node \"crc\" DevicePath \"\""
Mar 20 14:42:33 crc kubenswrapper[4895]: I0320 14:42:33.798700 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ctmn\" (UniqueName: \"kubernetes.io/projected/99e5761d-cb35-49cf-b0c7-c7faf41afa87-kube-api-access-8ctmn\") on node \"crc\" DevicePath \"\""
Mar 20 14:42:34 crc kubenswrapper[4895]: I0320 14:42:34.478347 4895 scope.go:117] "RemoveContainer" containerID="da011d62dcc72a6223f1797a89eeb2a85ea5c11b6db0d4a299e7e0aac4ea48f7"
Mar 20 14:42:34 crc kubenswrapper[4895]: I0320 14:42:34.478530 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/crc-debug-7vnzk"
Mar 20 14:42:35 crc kubenswrapper[4895]: I0320 14:42:35.225092 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e5761d-cb35-49cf-b0c7-c7faf41afa87" path="/var/lib/kubelet/pods/99e5761d-cb35-49cf-b0c7-c7faf41afa87/volumes"
Mar 20 14:42:52 crc kubenswrapper[4895]: I0320 14:42:52.297091 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:42:52 crc kubenswrapper[4895]: I0320 14:42:52.297699 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.297459 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.299021 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.299170 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.300180 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.300376 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24" gracePeriod=600
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.992682 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24" exitCode=0
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.992732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24"}
Mar 20 14:43:22 crc kubenswrapper[4895]: I0320 14:43:22.992800 4895 scope.go:117] "RemoveContainer" containerID="132193396fb8ce986b258e8f132aa1fa14dad493b6653552dccaa3fc98975d07"
Mar 20 14:43:24 crc kubenswrapper[4895]: I0320 14:43:24.006740 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"}
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.093669 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d413e49a-6f03-44fc-87bf-f6b71efac9ad/init-config-reloader/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.385748 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d413e49a-6f03-44fc-87bf-f6b71efac9ad/init-config-reloader/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.472164 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d413e49a-6f03-44fc-87bf-f6b71efac9ad/config-reloader/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.488907 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d413e49a-6f03-44fc-87bf-f6b71efac9ad/alertmanager/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.727210 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84c5f65f5b-7c6jl_f2ae76b1-0feb-45b7-9e94-063bc0c58ded/barbican-api/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.879738 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84c5f65f5b-7c6jl_f2ae76b1-0feb-45b7-9e94-063bc0c58ded/barbican-api-log/0.log"
Mar 20 14:43:45 crc kubenswrapper[4895]: I0320 14:43:45.975900 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7d445cd4-s2zjm_ebaa89fb-ad42-4038-a2fa-cbc9d2711354/barbican-keystone-listener/0.log"
Mar 20 14:43:46 crc kubenswrapper[4895]: I0320 14:43:46.210154 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b7d445cd4-s2zjm_ebaa89fb-ad42-4038-a2fa-cbc9d2711354/barbican-keystone-listener-log/0.log"
Mar 20 14:43:46 crc kubenswrapper[4895]: I0320 14:43:46.343520 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56f588c54c-qdk5g_443e18a5-4a5b-4678-8a19-dca8434a8a31/barbican-worker-log/0.log"
Mar 20 14:43:46 crc kubenswrapper[4895]: I0320 14:43:46.454880 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56f588c54c-qdk5g_443e18a5-4a5b-4678-8a19-dca8434a8a31/barbican-worker/0.log"
Mar 20 14:43:46 crc kubenswrapper[4895]: I0320 14:43:46.664609 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jx6lx_80853d34-f97d-49e6-b582-3408214efe70/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:46 crc kubenswrapper[4895]: I0320 14:43:46.934168 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1952c8e8-d8db-4bf4-81b5-57be48de5cbc/ceilometer-central-agent/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.110914 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1952c8e8-d8db-4bf4-81b5-57be48de5cbc/sg-core/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.262303 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1952c8e8-d8db-4bf4-81b5-57be48de5cbc/ceilometer-notification-agent/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.315314 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1952c8e8-d8db-4bf4-81b5-57be48de5cbc/proxy-httpd/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.564556 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f497df96-267d-4b80-8b6b-01fbd8a6477c/cinder-api/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.640315 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f497df96-267d-4b80-8b6b-01fbd8a6477c/cinder-api-log/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.783254 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7c1547aa-5cfe-4a70-bd45-0d06101f0e74/cinder-scheduler/0.log"
Mar 20 14:43:47 crc kubenswrapper[4895]: I0320 14:43:47.993974 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7c1547aa-5cfe-4a70-bd45-0d06101f0e74/probe/0.log"
Mar 20 14:43:48 crc kubenswrapper[4895]: I0320 14:43:48.293313 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8172bdb2-c101-4267-b041-46af02229c2c/cloudkitty-api/0.log"
Mar 20 14:43:48 crc kubenswrapper[4895]: I0320 14:43:48.800931 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8172bdb2-c101-4267-b041-46af02229c2c/cloudkitty-api-log/0.log"
Mar 20 14:43:49 crc kubenswrapper[4895]: I0320 14:43:49.122139 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_29bbaba0-0b8d-43f1-9863-dd93ef7f8e3e/loki-compactor/0.log"
Mar 20 14:43:49 crc kubenswrapper[4895]: I0320 14:43:49.198221 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-5d547bbd4d-5tkzk_b4cd9c2d-3b16-4152-9269-263b91fa4769/loki-distributor/0.log"
Mar 20 14:43:49 crc kubenswrapper[4895]: I0320 14:43:49.344758 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-82dcg_9eda3cc0-3576-46cb-8da1-12ca651af767/gateway/0.log"
Mar 20 14:43:49 crc kubenswrapper[4895]: I0320 14:43:49.670890 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-n2t45_faa3805b-edc0-4e1a-91e5-05667f94e119/gateway/0.log"
Mar 20 14:43:50 crc kubenswrapper[4895]: I0320 14:43:50.124246 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_2c72a116-103e-4be6-91c2-65168b4d456e/loki-index-gateway/0.log"
Mar 20 14:43:50 crc kubenswrapper[4895]: I0320 14:43:50.494566 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_27c73d65-3dcb-44cb-a61e-004919dda8b4/loki-ingester/0.log"
Mar 20 14:43:50 crc kubenswrapper[4895]: I0320 14:43:50.976200 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-6f54889599-h8n6z_97b1a9d8-e379-4fe0-9036-3c05e9620b4a/loki-query-frontend/0.log"
Mar 20 14:43:51 crc kubenswrapper[4895]: I0320 14:43:51.630042 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cjfwm_f741586e-ce78-4057-8e0c-032310d4e3a4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:52 crc kubenswrapper[4895]: I0320 14:43:52.246859 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-668f98fdd7-ltb4d_384ff1a6-c0b2-4b58-aac3-e847f789de25/loki-querier/0.log"
Mar 20 14:43:52 crc kubenswrapper[4895]: I0320 14:43:52.370097 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-l6lbm_77679ba3-7833-453d-b008-536582648587/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:52 crc kubenswrapper[4895]: I0320 14:43:52.751776 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-p6pm5_2bdf219a-4e3a-448d-9624-bc31e07f1ad2/init/0.log"
Mar 20 14:43:52 crc kubenswrapper[4895]: I0320 14:43:52.972632 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-p6pm5_2bdf219a-4e3a-448d-9624-bc31e07f1ad2/init/0.log"
Mar 20 14:43:53 crc kubenswrapper[4895]: I0320 14:43:53.261974 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c4b758ff5-p6pm5_2bdf219a-4e3a-448d-9624-bc31e07f1ad2/dnsmasq-dns/0.log"
Mar 20 14:43:53 crc kubenswrapper[4895]: I0320 14:43:53.389131 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7l89w_d4ba85e6-8f8d-4f5e-9e05-48690b7da983/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:53 crc kubenswrapper[4895]: I0320 14:43:53.595189 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cf90b0b3-45b3-4926-bf71-703b79d4cae4/glance-log/0.log"
Mar 20 14:43:53 crc kubenswrapper[4895]: I0320 14:43:53.602014 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cf90b0b3-45b3-4926-bf71-703b79d4cae4/glance-httpd/0.log"
Mar 20 14:43:54 crc kubenswrapper[4895]: I0320 14:43:54.041110 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_06d364d4-5809-40d8-8e14-11ae873d4c47/glance-log/0.log"
Mar 20 14:43:54 crc kubenswrapper[4895]: I0320 14:43:54.163550 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_06d364d4-5809-40d8-8e14-11ae873d4c47/glance-httpd/0.log"
Mar 20 14:43:54 crc kubenswrapper[4895]: I0320 14:43:54.397970 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_4838ae67-efa2-48a2-86e7-1cb231be8eed/cloudkitty-proc/0.log"
Mar 20 14:43:54 crc kubenswrapper[4895]: I0320 14:43:54.648054 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-m6xpj_3edac150-5a84-4c67-8999-f0161dc784ba/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:54 crc kubenswrapper[4895]: I0320 14:43:54.705173 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wkfk4_19650c9a-aeda-44ce-9793-a3b03e1d361d/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:55 crc kubenswrapper[4895]: I0320 14:43:55.049689 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-65558fd5f5-5tmzj_54aa85fa-25cb-409a-be60-4c0cb8468466/keystone-api/0.log"
Mar 20 14:43:55 crc kubenswrapper[4895]: I0320 14:43:55.138028 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29566921-87zd9_f4cdaa4c-2563-4f80-924b-33f19fd8a099/keystone-cron/0.log"
Mar 20 14:43:55 crc kubenswrapper[4895]: I0320 14:43:55.273902 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c239ab6f-e370-422d-8af1-dff391b88461/kube-state-metrics/0.log"
Mar 20 14:43:56 crc kubenswrapper[4895]: I0320 14:43:56.418382 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56bf665d85-xzq8s_8eecbe8c-a839-4641-b617-921265cd8f14/neutron-httpd/0.log"
Mar 20 14:43:56 crc kubenswrapper[4895]: I0320 14:43:56.453655 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56bf665d85-xzq8s_8eecbe8c-a839-4641-b617-921265cd8f14/neutron-api/0.log"
Mar 20 14:43:56 crc kubenswrapper[4895]: I0320 14:43:56.603123 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-sb4wg_584873be-8282-406d-9a6a-2abb61f6d3bd/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:56 crc kubenswrapper[4895]: I0320 14:43:56.772516 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jvgd5_dc06d8c6-f0e5-4555-84d0-e67d1a358e18/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:57 crc kubenswrapper[4895]: I0320 14:43:57.599386 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6013cf3c-ce92-4c95-b649-5a7d05f4e1fd/nova-api-api/0.log"
Mar 20 14:43:57 crc kubenswrapper[4895]: I0320 14:43:57.782862 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6013cf3c-ce92-4c95-b649-5a7d05f4e1fd/nova-api-log/0.log"
Mar 20 14:43:58 crc kubenswrapper[4895]: I0320 14:43:58.057048 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4b4528c7-f8b8-4f3c-b86b-a803fee7d982/nova-cell0-conductor-conductor/0.log"
Mar 20 14:43:58 crc kubenswrapper[4895]: I0320 14:43:58.217654 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_58722a98-11a8-4e98-8185-82f18acd6718/nova-cell1-conductor-conductor/0.log"
Mar 20 14:43:58 crc kubenswrapper[4895]: I0320 14:43:58.873760 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_259d9241-bf26-46fe-85ea-8ce9efdf0821/nova-cell1-novncproxy-novncproxy/0.log"
Mar 20 14:43:59 crc kubenswrapper[4895]: I0320 14:43:59.433969 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5v4cr_dabc07a2-735b-409d-826e-9f4877cc40fe/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:43:59 crc kubenswrapper[4895]: I0320 14:43:59.626301 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_155eaf40-0b01-4dba-af34-0fce0b907680/nova-metadata-log/0.log"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.170455 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hlw62"]
Mar 20 14:44:00 crc kubenswrapper[4895]: E0320 14:44:00.170943 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e5761d-cb35-49cf-b0c7-c7faf41afa87" containerName="container-00"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.170956 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e5761d-cb35-49cf-b0c7-c7faf41afa87" containerName="container-00"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.171165 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e5761d-cb35-49cf-b0c7-c7faf41afa87" containerName="container-00"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.172004 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.183886 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.184111 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.184246 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.224352 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hlw62"]
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.263007 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c851b618-6bf5-4291-ae40-20ed962dfe46/mysql-bootstrap/0.log"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.341124 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d195aa63-ddd2-44d4-b7ec-fc6761422619/nova-scheduler-scheduler/0.log"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.347672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4kl\" (UniqueName: \"kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl\") pod \"auto-csr-approver-29566964-hlw62\" (UID: \"7fe1df06-4d13-45e2-898d-deae9f895a68\") " pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.433049 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_155eaf40-0b01-4dba-af34-0fce0b907680/nova-metadata-metadata/0.log"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.452488 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4kl\" (UniqueName: \"kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl\") pod \"auto-csr-approver-29566964-hlw62\" (UID: \"7fe1df06-4d13-45e2-898d-deae9f895a68\") " pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.481312 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4kl\" (UniqueName: \"kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl\") pod \"auto-csr-approver-29566964-hlw62\" (UID: \"7fe1df06-4d13-45e2-898d-deae9f895a68\") " pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.525876 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:00 crc kubenswrapper[4895]: I0320 14:44:00.929289 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c851b618-6bf5-4291-ae40-20ed962dfe46/galera/0.log"
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.136697 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c851b618-6bf5-4291-ae40-20ed962dfe46/mysql-bootstrap/0.log"
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.183894 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6b3c4f62-dc8a-49bd-b97e-d57133678e19/mysql-bootstrap/0.log"
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.336525 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.380936 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hlw62"]
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.441756 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hlw62" event={"ID":"7fe1df06-4d13-45e2-898d-deae9f895a68","Type":"ContainerStarted","Data":"9ae35abaafef4dc3ed38c6f93be3c32524abae00629752efd9e64d61d7318442"}
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.616491 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6b3c4f62-dc8a-49bd-b97e-d57133678e19/galera/0.log"
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.644099 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6b3c4f62-dc8a-49bd-b97e-d57133678e19/mysql-bootstrap/0.log"
Mar 20 14:44:01 crc kubenswrapper[4895]: I0320 14:44:01.793353 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c52a1a0f-5544-4b98-8746-4bb3d7066c87/openstackclient/0.log"
Mar 20 14:44:02 crc kubenswrapper[4895]: I0320 14:44:02.108855 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4phvm_f0db633f-39ca-4915-ab69-a17d9140e31b/ovn-controller/0.log"
Mar 20 14:44:02 crc kubenswrapper[4895]: I0320 14:44:02.191196 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-46hcc_2be1f9cb-60c7-4ffd-8ac2-d7e47df959d2/openstack-network-exporter/0.log"
Mar 20 14:44:02 crc kubenswrapper[4895]: I0320 14:44:02.601217 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvskb_ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9/ovsdb-server-init/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.097272 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvskb_ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9/ovs-vswitchd/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.360111 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvskb_ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9/ovsdb-server/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.361342 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mvskb_ea15e08a-0dc3-4b15-a90a-e06ae11a2ac9/ovsdb-server-init/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.471877 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hlw62" event={"ID":"7fe1df06-4d13-45e2-898d-deae9f895a68","Type":"ContainerStarted","Data":"2a57fb5b93e65149b21a8138232395c0061ed5560da7ea565817de27b71d7f52"}
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.514005 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566964-hlw62" podStartSLOduration=2.28669118 podStartE2EDuration="3.513988736s" podCreationTimestamp="2026-03-20 14:44:00 +0000 UTC" firstStartedPulling="2026-03-20 14:44:01.336199689 +0000 UTC m=+4940.845918655" lastFinishedPulling="2026-03-20 14:44:02.563497245 +0000 UTC m=+4942.073216211" observedRunningTime="2026-03-20 14:44:03.498916537 +0000 UTC m=+4943.008635503" watchObservedRunningTime="2026-03-20 14:44:03.513988736 +0000 UTC m=+4943.023707702"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.819862 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-z9xg4_d67ed2a1-5121-41c0-a5d9-3962837f0cb2/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.880017 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_676abdd9-331c-4d23-817d-a608c366a737/ovn-northd/0.log"
Mar 20 14:44:03 crc kubenswrapper[4895]: I0320 14:44:03.880350 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_676abdd9-331c-4d23-817d-a608c366a737/openstack-network-exporter/0.log"
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.218739 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af691a5e-1267-46ec-9d39-f4fa047a1741/ovsdbserver-nb/0.log"
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.310638 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_af691a5e-1267-46ec-9d39-f4fa047a1741/openstack-network-exporter/0.log"
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.498152 4895 generic.go:334] "Generic (PLEG): container finished" podID="7fe1df06-4d13-45e2-898d-deae9f895a68" containerID="2a57fb5b93e65149b21a8138232395c0061ed5560da7ea565817de27b71d7f52" exitCode=0
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.498846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hlw62" event={"ID":"7fe1df06-4d13-45e2-898d-deae9f895a68","Type":"ContainerDied","Data":"2a57fb5b93e65149b21a8138232395c0061ed5560da7ea565817de27b71d7f52"}
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.606458 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5761f186-a7a3-4ce2-8ed9-bcea12b186c8/openstack-network-exporter/0.log"
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.734293 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5761f186-a7a3-4ce2-8ed9-bcea12b186c8/ovsdbserver-sb/0.log"
Mar 20 14:44:04 crc kubenswrapper[4895]: I0320 14:44:04.922251 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b99d76fbb-c92mx_46abdc7f-1f99-44dc-8cc1-3a7c61186946/placement-api/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.096477 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b99d76fbb-c92mx_46abdc7f-1f99-44dc-8cc1-3a7c61186946/placement-log/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.359787 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_be90380e-db54-4216-8972-507d8c538e4b/init-config-reloader/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.769241 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_be90380e-db54-4216-8972-507d8c538e4b/config-reloader/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.805908 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_be90380e-db54-4216-8972-507d8c538e4b/prometheus/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.816036 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_be90380e-db54-4216-8972-507d8c538e4b/init-config-reloader/0.log"
Mar 20 14:44:05 crc kubenswrapper[4895]: I0320 14:44:05.887850 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_be90380e-db54-4216-8972-507d8c538e4b/thanos-sidecar/0.log"
Mar 20 14:44:06 crc kubenswrapper[4895]: I0320 14:44:06.115012 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa162ed3-a588-406c-a81e-5aafc5a82d05/setup-container/0.log"
Mar 20 14:44:06 crc kubenswrapper[4895]: I0320 14:44:06.618351 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa162ed3-a588-406c-a81e-5aafc5a82d05/rabbitmq/0.log"
Mar 20 14:44:06 crc kubenswrapper[4895]: I0320 14:44:06.757009 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa162ed3-a588-406c-a81e-5aafc5a82d05/setup-container/0.log"
Mar 20 14:44:06 crc kubenswrapper[4895]: I0320 14:44:06.802308 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a6f84dd-56f2-4594-a3a0-bd428f57c6be/setup-container/0.log"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.204538 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.351481 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk4kl\" (UniqueName: \"kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl\") pod \"7fe1df06-4d13-45e2-898d-deae9f895a68\" (UID: \"7fe1df06-4d13-45e2-898d-deae9f895a68\") "
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.360175 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl" (OuterVolumeSpecName: "kube-api-access-vk4kl") pod "7fe1df06-4d13-45e2-898d-deae9f895a68" (UID: "7fe1df06-4d13-45e2-898d-deae9f895a68"). InnerVolumeSpecName "kube-api-access-vk4kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.453965 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk4kl\" (UniqueName: \"kubernetes.io/projected/7fe1df06-4d13-45e2-898d-deae9f895a68-kube-api-access-vk4kl\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.527259 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hlw62" event={"ID":"7fe1df06-4d13-45e2-898d-deae9f895a68","Type":"ContainerDied","Data":"9ae35abaafef4dc3ed38c6f93be3c32524abae00629752efd9e64d61d7318442"}
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.527304 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae35abaafef4dc3ed38c6f93be3c32524abae00629752efd9e64d61d7318442"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.527353 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hlw62"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.589494 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a6f84dd-56f2-4594-a3a0-bd428f57c6be/rabbitmq/0.log"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.642910 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a6f84dd-56f2-4594-a3a0-bd428f57c6be/setup-container/0.log"
Mar 20 14:44:07 crc kubenswrapper[4895]: I0320 14:44:07.652498 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hgnsb_1a2a0abe-d614-4f65-b832-06b9ddbdef54/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:44:08 crc kubenswrapper[4895]: I0320 14:44:08.069627 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hhftb_389115ab-14fd-4b1e-96a3-33453ff90899/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:44:08 crc kubenswrapper[4895]: I0320 14:44:08.206460 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cq7kj_57227803-046a-4dd7-8f7f-c93f09f2ab4c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:44:08 crc kubenswrapper[4895]: I0320 14:44:08.290583 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-8r855"]
Mar 20 14:44:08 crc kubenswrapper[4895]: I0320 14:44:08.302888 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-8r855"]
Mar 20 14:44:08 crc kubenswrapper[4895]: I0320 14:44:08.711437 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-txfsl_9fde50f5-9c7a-4737-9d42-f6df58df9629/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.238827 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1211640-b38a-4c4f-883f-da719892cd75" path="/var/lib/kubelet/pods/e1211640-b38a-4c4f-883f-da719892cd75/volumes"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.344039 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jhd4n_d18f4c75-cf01-4b82-844a-f24b83ddfb7a/ssh-known-hosts-edpm-deployment/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.531689 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d4b947447-mjwnr_055085bc-2288-49cd-b07f-28747f5a6458/proxy-server/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.541666 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7c64e6c1-1601-4c6d-9cfe-2287e9147576/memcached/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.635899 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d4b947447-mjwnr_055085bc-2288-49cd-b07f-28747f5a6458/proxy-httpd/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.724658 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2x97m_ef2bf2f7-0cd4-4c17-8e27-a5c250fe761a/swift-ring-rebalance/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.877738 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/account-auditor/0.log"
Mar 20 14:44:09 crc kubenswrapper[4895]: I0320 14:44:09.999609 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/account-reaper/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.026899 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/account-replicator/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.214340 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/account-server/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.225766 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/container-auditor/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.343287 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/container-replicator/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.470730 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/container-server/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.652195 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/object-auditor/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.681976 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/object-replicator/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.758871 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/container-updater/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.817543 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/object-expirer/0.log"
Mar 20 14:44:10 crc kubenswrapper[4895]: I0320 14:44:10.889660 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/object-server/0.log"
Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.091112 4895 log.go:25] "Finished
parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/rsync/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.096985 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/object-updater/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.163562 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a1dc57ca-aca1-4886-ba82-f2f4b73944a1/swift-recon-cron/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.559480 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-fmfhv_a15bb8dc-2a80-48a7-aa1b-6b0bc8103525/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.612144 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_2acaa768-7497-437a-bd7d-46308eb5e0e2/tempest-tests-tempest-tests-runner/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.884234 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_54ae286f-9132-4ea8-bdfa-3db69f52a13b/test-operator-logs-container/0.log" Mar 20 14:44:11 crc kubenswrapper[4895]: I0320 14:44:11.916546 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-hsvxq_7d36a84a-b329-40b5-8da0-4a01ff417cc4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:44:27 crc kubenswrapper[4895]: I0320 14:44:27.535868 4895 scope.go:117] "RemoveContainer" containerID="747d4815c8649671f611e5e1f7a1158660273dcd81b036577fd9d7f6b909464f" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.163973 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5"] Mar 20 14:45:00 
crc kubenswrapper[4895]: E0320 14:45:00.165131 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe1df06-4d13-45e2-898d-deae9f895a68" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.165149 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe1df06-4d13-45e2-898d-deae9f895a68" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.165377 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe1df06-4d13-45e2-898d-deae9f895a68" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.166343 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.169117 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.176499 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.177500 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5"] Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.311805 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lhp\" (UniqueName: \"kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.311858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.312104 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.413819 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lhp\" (UniqueName: \"kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.413877 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.413946 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc 
kubenswrapper[4895]: I0320 14:45:00.415059 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.424764 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.435100 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lhp\" (UniqueName: \"kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp\") pod \"collect-profiles-29566965-vt5j5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:00 crc kubenswrapper[4895]: I0320 14:45:00.485851 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:01 crc kubenswrapper[4895]: I0320 14:45:01.281187 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5"] Mar 20 14:45:02 crc kubenswrapper[4895]: I0320 14:45:02.099725 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" event={"ID":"3994688e-c5b6-45ff-a71f-f8207ed5e3b5","Type":"ContainerStarted","Data":"b78267bca043486855393c0cc5d78d137feb4b09f1bc029cd83a6867982c3ce4"} Mar 20 14:45:02 crc kubenswrapper[4895]: I0320 14:45:02.099991 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" event={"ID":"3994688e-c5b6-45ff-a71f-f8207ed5e3b5","Type":"ContainerStarted","Data":"4ee1de5ea9df74192ca22e472a53fc8b195572581dfbae4b1aec7163a928132d"} Mar 20 14:45:02 crc kubenswrapper[4895]: I0320 14:45:02.127249 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" podStartSLOduration=2.12723199 podStartE2EDuration="2.12723199s" podCreationTimestamp="2026-03-20 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:02.124068203 +0000 UTC m=+5001.633787169" watchObservedRunningTime="2026-03-20 14:45:02.12723199 +0000 UTC m=+5001.636950956" Mar 20 14:45:02 crc kubenswrapper[4895]: E0320 14:45:02.226469 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3994688e_c5b6_45ff_a71f_f8207ed5e3b5.slice/crio-b78267bca043486855393c0cc5d78d137feb4b09f1bc029cd83a6867982c3ce4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3994688e_c5b6_45ff_a71f_f8207ed5e3b5.slice/crio-conmon-b78267bca043486855393c0cc5d78d137feb4b09f1bc029cd83a6867982c3ce4.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:45:03 crc kubenswrapper[4895]: I0320 14:45:03.112495 4895 generic.go:334] "Generic (PLEG): container finished" podID="3994688e-c5b6-45ff-a71f-f8207ed5e3b5" containerID="b78267bca043486855393c0cc5d78d137feb4b09f1bc029cd83a6867982c3ce4" exitCode=0 Mar 20 14:45:03 crc kubenswrapper[4895]: I0320 14:45:03.112564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" event={"ID":"3994688e-c5b6-45ff-a71f-f8207ed5e3b5","Type":"ContainerDied","Data":"b78267bca043486855393c0cc5d78d137feb4b09f1bc029cd83a6867982c3ce4"} Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.376668 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.535593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lhp\" (UniqueName: \"kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp\") pod \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.536020 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume\") pod \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.536086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume\") pod \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\" (UID: \"3994688e-c5b6-45ff-a71f-f8207ed5e3b5\") " Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.536734 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "3994688e-c5b6-45ff-a71f-f8207ed5e3b5" (UID: "3994688e-c5b6-45ff-a71f-f8207ed5e3b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.638946 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.932626 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp" (OuterVolumeSpecName: "kube-api-access-m5lhp") pod "3994688e-c5b6-45ff-a71f-f8207ed5e3b5" (UID: "3994688e-c5b6-45ff-a71f-f8207ed5e3b5"). InnerVolumeSpecName "kube-api-access-m5lhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.945611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3994688e-c5b6-45ff-a71f-f8207ed5e3b5" (UID: "3994688e-c5b6-45ff-a71f-f8207ed5e3b5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.946782 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5lhp\" (UniqueName: \"kubernetes.io/projected/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-kube-api-access-m5lhp\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:05 crc kubenswrapper[4895]: I0320 14:45:05.946819 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3994688e-c5b6-45ff-a71f-f8207ed5e3b5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:06 crc kubenswrapper[4895]: I0320 14:45:06.143854 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" event={"ID":"3994688e-c5b6-45ff-a71f-f8207ed5e3b5","Type":"ContainerDied","Data":"4ee1de5ea9df74192ca22e472a53fc8b195572581dfbae4b1aec7163a928132d"} Mar 20 14:45:06 crc kubenswrapper[4895]: I0320 14:45:06.143902 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee1de5ea9df74192ca22e472a53fc8b195572581dfbae4b1aec7163a928132d" Mar 20 14:45:06 crc kubenswrapper[4895]: I0320 14:45:06.143969 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-vt5j5" Mar 20 14:45:06 crc kubenswrapper[4895]: I0320 14:45:06.457055 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x"] Mar 20 14:45:06 crc kubenswrapper[4895]: I0320 14:45:06.475223 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-v4q6x"] Mar 20 14:45:07 crc kubenswrapper[4895]: I0320 14:45:07.222796 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de4d5d0-e193-4c77-b93d-bb677e3cfc7a" path="/var/lib/kubelet/pods/3de4d5d0-e193-4c77-b93d-bb677e3cfc7a/volumes" Mar 20 14:45:09 crc kubenswrapper[4895]: I0320 14:45:09.847413 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/util/0.log" Mar 20 14:45:10 crc kubenswrapper[4895]: I0320 14:45:10.092555 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/util/0.log" Mar 20 14:45:10 crc kubenswrapper[4895]: I0320 14:45:10.291024 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/pull/0.log" Mar 20 14:45:10 crc kubenswrapper[4895]: I0320 14:45:10.451915 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/pull/0.log" Mar 20 14:45:11 crc kubenswrapper[4895]: I0320 14:45:11.000618 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/util/0.log" Mar 20 14:45:11 crc kubenswrapper[4895]: I0320 14:45:11.142179 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/extract/0.log" Mar 20 14:45:11 crc kubenswrapper[4895]: I0320 14:45:11.192705 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243jzww2_0cbb9e45-1500-4b44-959c-2a9b4f0a587c/pull/0.log" Mar 20 14:45:11 crc kubenswrapper[4895]: I0320 14:45:11.458428 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-h6qcp_45950a96-a521-4429-b7d1-71efa644a087/manager/0.log" Mar 20 14:45:11 crc kubenswrapper[4895]: I0320 14:45:11.885203 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-6rrsx_8ec4bb36-473c-4103-bfeb-10e8df206b9a/manager/0.log" Mar 20 14:45:12 crc kubenswrapper[4895]: I0320 14:45:12.373503 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-vvwnk_e59747be-3214-43b9-b75b-88b8e7e71484/manager/0.log" Mar 20 14:45:12 crc kubenswrapper[4895]: I0320 14:45:12.759212 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-f4sjs_90753829-7cac-4f8f-8aa5-086430d0eafa/manager/0.log" Mar 20 14:45:12 crc kubenswrapper[4895]: I0320 14:45:12.787696 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-x55wz_0039adb6-7c13-414b-bbd6-25e759da85b7/manager/0.log" Mar 20 14:45:13 crc kubenswrapper[4895]: I0320 14:45:13.105078 
4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-8tttj_810c2ef6-f5e6-4003-b01e-76e1edbbe452/manager/0.log" Mar 20 14:45:13 crc kubenswrapper[4895]: I0320 14:45:13.555502 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-dnmhw_e386d39f-7654-4d1d-84fc-6796309ac427/manager/0.log" Mar 20 14:45:13 crc kubenswrapper[4895]: I0320 14:45:13.829557 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-pk85v_4c144b6f-b36e-442a-8aa8-8ffa93bf9eaa/manager/0.log" Mar 20 14:45:14 crc kubenswrapper[4895]: I0320 14:45:14.049631 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-nsh2d_9d9feeae-ff51-432c-a4a4-e375d743f0b3/manager/0.log" Mar 20 14:45:14 crc kubenswrapper[4895]: I0320 14:45:14.204818 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-bqdrg_7d7e8ef8-065c-40c0-a396-915b7efdd1a0/manager/0.log" Mar 20 14:45:14 crc kubenswrapper[4895]: I0320 14:45:14.706844 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xvkkg_de58ceb7-b3dd-487f-95eb-48d02a0accc3/manager/0.log" Mar 20 14:45:14 crc kubenswrapper[4895]: I0320 14:45:14.747798 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-c2kgq_0d066703-200f-472a-b768-f6aef5eb347f/manager/0.log" Mar 20 14:45:14 crc kubenswrapper[4895]: I0320 14:45:14.973035 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-7gt5d_7a69fbe4-c8ec-4914-b93e-3d234e7c1a9c/manager/0.log" Mar 20 14:45:15 crc kubenswrapper[4895]: I0320 
14:45:15.133008 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-8nddx_73e2c644-bdbd-4769-946a-4e2111a28326/manager/0.log" Mar 20 14:45:15 crc kubenswrapper[4895]: I0320 14:45:15.360654 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-cm5d4_de5694d4-a796-46ee-9f84-4b9d35475f27/manager/0.log" Mar 20 14:45:15 crc kubenswrapper[4895]: I0320 14:45:15.533913 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f7459b8bf-lvm5m_d45d9a5f-9ee2-494a-9c05-5fd7cc094da4/operator/0.log" Mar 20 14:45:15 crc kubenswrapper[4895]: I0320 14:45:15.864901 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cgtq6_94b37e00-b08d-4f87-8e18-d758b22a4079/registry-server/0.log" Mar 20 14:45:15 crc kubenswrapper[4895]: I0320 14:45:15.952504 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-br56m_f8f4d668-e8ad-4c6c-9107-6221569d3079/manager/0.log" Mar 20 14:45:16 crc kubenswrapper[4895]: I0320 14:45:16.460050 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-c2cz2_43459d05-1aac-46b1-b690-1b8c948bbb07/operator/0.log" Mar 20 14:45:16 crc kubenswrapper[4895]: I0320 14:45:16.525476 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-f6m2m_680cd993-89dd-47f4-8555-b49ff8293a76/manager/0.log" Mar 20 14:45:16 crc kubenswrapper[4895]: I0320 14:45:16.873703 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6xwg6_e9c5c274-21be-4e53-99f7-d1ab4f352142/manager/0.log" Mar 20 14:45:16 crc kubenswrapper[4895]: 
I0320 14:45:16.943416 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-78865ff6b4-c6nbz_27b2849d-9127-4c6b-a83f-a1ce0af6cac8/manager/0.log" Mar 20 14:45:17 crc kubenswrapper[4895]: I0320 14:45:17.333363 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-fbb6f4f4f-rbm9d_4ce923f6-b8eb-4461-a222-0af773470e76/manager/0.log" Mar 20 14:45:17 crc kubenswrapper[4895]: I0320 14:45:17.432187 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-ppltl_cc9f95f5-a6fd-4638-989e-3dff592f5022/manager/0.log" Mar 20 14:45:17 crc kubenswrapper[4895]: I0320 14:45:17.528043 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-8xlbt_1d3b843f-4b33-455f-9d52-6a0267d370cb/manager/0.log" Mar 20 14:45:27 crc kubenswrapper[4895]: I0320 14:45:27.736087 4895 scope.go:117] "RemoveContainer" containerID="fcd71d0b68095e12a5804149affdef1b9ae7dbfb87e9ce139cc8f2122ac8d986" Mar 20 14:45:52 crc kubenswrapper[4895]: I0320 14:45:52.297218 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:45:52 crc kubenswrapper[4895]: I0320 14:45:52.297789 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.165570 4895 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29566966-5fsn5"]
Mar 20 14:46:00 crc kubenswrapper[4895]: E0320 14:46:00.167164 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3994688e-c5b6-45ff-a71f-f8207ed5e3b5" containerName="collect-profiles"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.167182 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3994688e-c5b6-45ff-a71f-f8207ed5e3b5" containerName="collect-profiles"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.167486 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3994688e-c5b6-45ff-a71f-f8207ed5e3b5" containerName="collect-profiles"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.175189 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.180919 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.182759 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.182998 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.196227 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-5fsn5"]
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.276199 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjvm\" (UniqueName: \"kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm\") pod \"auto-csr-approver-29566966-5fsn5\" (UID: \"a70aff11-9c5c-4e3e-b78d-85edca750405\") " pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.378202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjvm\" (UniqueName: \"kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm\") pod \"auto-csr-approver-29566966-5fsn5\" (UID: \"a70aff11-9c5c-4e3e-b78d-85edca750405\") " pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.406102 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjvm\" (UniqueName: \"kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm\") pod \"auto-csr-approver-29566966-5fsn5\" (UID: \"a70aff11-9c5c-4e3e-b78d-85edca750405\") " pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:00 crc kubenswrapper[4895]: I0320 14:46:00.496330 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:01 crc kubenswrapper[4895]: I0320 14:46:01.269682 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2dq2z_bc0a8d83-a2d4-4231-a024-85e6cf31955c/control-plane-machine-set-operator/0.log"
Mar 20 14:46:01 crc kubenswrapper[4895]: I0320 14:46:01.434303 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-5fsn5"]
Mar 20 14:46:01 crc kubenswrapper[4895]: I0320 14:46:01.694813 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-px7gz_4005b37f-581a-4651-9dcb-f16414503616/kube-rbac-proxy/0.log"
Mar 20 14:46:01 crc kubenswrapper[4895]: I0320 14:46:01.722770 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-5fsn5" event={"ID":"a70aff11-9c5c-4e3e-b78d-85edca750405","Type":"ContainerStarted","Data":"205d975969cad51e2099de2c314a04542ed91fcd41ba937b27ca6239e2c48421"}
Mar 20 14:46:01 crc kubenswrapper[4895]: I0320 14:46:01.858418 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-px7gz_4005b37f-581a-4651-9dcb-f16414503616/machine-api-operator/0.log"
Mar 20 14:46:03 crc kubenswrapper[4895]: I0320 14:46:03.781545 4895 generic.go:334] "Generic (PLEG): container finished" podID="a70aff11-9c5c-4e3e-b78d-85edca750405" containerID="9b63fa02113b7d01900518a91f87a146bbe3861011d1dfeb8c48940ee894027c" exitCode=0
Mar 20 14:46:03 crc kubenswrapper[4895]: I0320 14:46:03.782518 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-5fsn5" event={"ID":"a70aff11-9c5c-4e3e-b78d-85edca750405","Type":"ContainerDied","Data":"9b63fa02113b7d01900518a91f87a146bbe3861011d1dfeb8c48940ee894027c"}
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.651676 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.769335 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbjvm\" (UniqueName: \"kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm\") pod \"a70aff11-9c5c-4e3e-b78d-85edca750405\" (UID: \"a70aff11-9c5c-4e3e-b78d-85edca750405\") "
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.776639 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm" (OuterVolumeSpecName: "kube-api-access-vbjvm") pod "a70aff11-9c5c-4e3e-b78d-85edca750405" (UID: "a70aff11-9c5c-4e3e-b78d-85edca750405"). InnerVolumeSpecName "kube-api-access-vbjvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.807446 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-5fsn5" event={"ID":"a70aff11-9c5c-4e3e-b78d-85edca750405","Type":"ContainerDied","Data":"205d975969cad51e2099de2c314a04542ed91fcd41ba937b27ca6239e2c48421"}
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.807509 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205d975969cad51e2099de2c314a04542ed91fcd41ba937b27ca6239e2c48421"
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.807582 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-5fsn5"
Mar 20 14:46:06 crc kubenswrapper[4895]: I0320 14:46:06.875779 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbjvm\" (UniqueName: \"kubernetes.io/projected/a70aff11-9c5c-4e3e-b78d-85edca750405-kube-api-access-vbjvm\") on node \"crc\" DevicePath \"\""
Mar 20 14:46:07 crc kubenswrapper[4895]: I0320 14:46:07.731463 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-4vtlx"]
Mar 20 14:46:07 crc kubenswrapper[4895]: I0320 14:46:07.768967 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-4vtlx"]
Mar 20 14:46:09 crc kubenswrapper[4895]: I0320 14:46:09.225263 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f9ea03-8b03-43c1-bb16-c5a66933bd62" path="/var/lib/kubelet/pods/d4f9ea03-8b03-43c1-bb16-c5a66933bd62/volumes"
Mar 20 14:46:22 crc kubenswrapper[4895]: I0320 14:46:22.297207 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:46:22 crc kubenswrapper[4895]: I0320 14:46:22.297778 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:46:27 crc kubenswrapper[4895]: I0320 14:46:27.800432 4895 scope.go:117] "RemoveContainer" containerID="924a5dbbde731e0cc136aaf1e53e308aae193be5ce3d0ebf268897457d14ab89"
Mar 20 14:46:31 crc kubenswrapper[4895]: I0320 14:46:31.298527 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-h6r5b_5d6b68aa-d45c-4f3a-806a-c1ba3f2d9dfe/cert-manager-controller/0.log"
Mar 20 14:46:32 crc kubenswrapper[4895]: I0320 14:46:32.207864 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dm9rw_b5f5308f-82f4-432c-b1fa-b7a9554c691b/cert-manager-webhook/0.log"
Mar 20 14:46:32 crc kubenswrapper[4895]: I0320 14:46:32.260627 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6drpc_fb60419b-e2e6-4f98-b5c8-846b4a670eb4/cert-manager-cainjector/0.log"
Mar 20 14:46:52 crc kubenswrapper[4895]: I0320 14:46:52.297527 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:46:52 crc kubenswrapper[4895]: I0320 14:46:52.298114 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:46:52 crc kubenswrapper[4895]: I0320 14:46:52.298172 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 14:46:52 crc kubenswrapper[4895]: I0320 14:46:52.299027 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:46:52 crc kubenswrapper[4895]: I0320 14:46:52.299100 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" gracePeriod=600
Mar 20 14:46:52 crc kubenswrapper[4895]: E0320 14:46:52.465338 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:46:53 crc kubenswrapper[4895]: I0320 14:46:53.207494 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" exitCode=0
Mar 20 14:46:53 crc kubenswrapper[4895]: I0320 14:46:53.207540 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"}
Mar 20 14:46:53 crc kubenswrapper[4895]: I0320 14:46:53.207571 4895 scope.go:117] "RemoveContainer" containerID="3e9d83274de2ecb43ca975270c7f7d702f55d8f835a903422a816374d81b5e24"
Mar 20 14:46:53 crc kubenswrapper[4895]: I0320 14:46:53.208229 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:46:53 crc kubenswrapper[4895]: E0320 14:46:53.208663 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:47:04 crc kubenswrapper[4895]: I0320 14:47:04.032606 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-h2wm7_2fb869f8-f1cb-4f72-8ef8-d1969ea326aa/nmstate-console-plugin/0.log"
Mar 20 14:47:04 crc kubenswrapper[4895]: I0320 14:47:04.211545 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:47:04 crc kubenswrapper[4895]: E0320 14:47:04.211995 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:47:04 crc kubenswrapper[4895]: I0320 14:47:04.909057 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fmpgx_0a5d87c8-dc19-4250-b749-8827eb2de72f/nmstate-handler/0.log"
Mar 20 14:47:05 crc kubenswrapper[4895]: I0320 14:47:05.031454 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zrhs4_44daea46-7374-461a-9926-c66cb642296d/kube-rbac-proxy/0.log"
Mar 20 14:47:05 crc kubenswrapper[4895]: I0320 14:47:05.033706 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zrhs4_44daea46-7374-461a-9926-c66cb642296d/nmstate-metrics/0.log"
Mar 20 14:47:05 crc kubenswrapper[4895]: I0320 14:47:05.495528 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-cxq4q_223467bd-bc95-4b07-96d3-0cda20ea5a3c/nmstate-operator/0.log"
Mar 20 14:47:05 crc kubenswrapper[4895]: I0320 14:47:05.533845 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-xmmr9_7d6c7d25-f568-4cde-9716-c5fa4b4747b7/nmstate-webhook/0.log"
Mar 20 14:47:18 crc kubenswrapper[4895]: I0320 14:47:18.211523 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:47:18 crc kubenswrapper[4895]: E0320 14:47:18.212421 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:47:27 crc kubenswrapper[4895]: I0320 14:47:27.732795 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="6b3c4f62-dc8a-49bd-b97e-d57133678e19" containerName="galera" probeResult="failure" output="command timed out"
Mar 20 14:47:27 crc kubenswrapper[4895]: I0320 14:47:27.885875 4895 scope.go:117] "RemoveContainer" containerID="0134536c4bd4061074fa5b4fc912a61aff201d2779aafa83672f34ef96f14c18"
Mar 20 14:47:29 crc kubenswrapper[4895]: I0320 14:47:29.212676 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:47:29 crc kubenswrapper[4895]: E0320 14:47:29.213232 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:47:38 crc kubenswrapper[4895]: I0320 14:47:38.603875 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dd7dbfbcf-qkd9d_bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca/kube-rbac-proxy/0.log"
Mar 20 14:47:39 crc kubenswrapper[4895]: I0320 14:47:39.213716 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dd7dbfbcf-qkd9d_bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca/manager/0.log"
Mar 20 14:47:42 crc kubenswrapper[4895]: I0320 14:47:42.211799 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:47:42 crc kubenswrapper[4895]: E0320 14:47:42.212489 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:47:57 crc kubenswrapper[4895]: I0320 14:47:57.212017 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:47:57 crc kubenswrapper[4895]: E0320 14:47:57.212865 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.159063 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566968-vkp5t"]
Mar 20 14:48:00 crc kubenswrapper[4895]: E0320 14:48:00.160093 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70aff11-9c5c-4e3e-b78d-85edca750405" containerName="oc"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.160109 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70aff11-9c5c-4e3e-b78d-85edca750405" containerName="oc"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.160299 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70aff11-9c5c-4e3e-b78d-85edca750405" containerName="oc"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.161105 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.163257 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.163778 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.164695 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.173508 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-vkp5t"]
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.232852 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h82g\" (UniqueName: \"kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g\") pod \"auto-csr-approver-29566968-vkp5t\" (UID: \"bd6d77fb-5b57-4c86-a555-358411dc9392\") " pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.335818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h82g\" (UniqueName: \"kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g\") pod \"auto-csr-approver-29566968-vkp5t\" (UID: \"bd6d77fb-5b57-4c86-a555-358411dc9392\") " pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.357082 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h82g\" (UniqueName: \"kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g\") pod \"auto-csr-approver-29566968-vkp5t\" (UID: \"bd6d77fb-5b57-4c86-a555-358411dc9392\") " pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:00 crc kubenswrapper[4895]: I0320 14:48:00.482961 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:01 crc kubenswrapper[4895]: I0320 14:48:01.225408 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-vkp5t"]
Mar 20 14:48:01 crc kubenswrapper[4895]: I0320 14:48:01.960660 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-vkp5t" event={"ID":"bd6d77fb-5b57-4c86-a555-358411dc9392","Type":"ContainerStarted","Data":"82998c9960815298b219d59e3cd448d71f6a44d97d78db777283ad0a2c8c8482"}
Mar 20 14:48:02 crc kubenswrapper[4895]: I0320 14:48:02.970195 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-vkp5t" event={"ID":"bd6d77fb-5b57-4c86-a555-358411dc9392","Type":"ContainerStarted","Data":"bec683f63221d09824bf5c2847e5cba9d7ef3a49a2205279cc99e8ff7f324709"}
Mar 20 14:48:02 crc kubenswrapper[4895]: I0320 14:48:02.998022 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566968-vkp5t" podStartSLOduration=1.9362613180000001 podStartE2EDuration="2.998005378s" podCreationTimestamp="2026-03-20 14:48:00 +0000 UTC" firstStartedPulling="2026-03-20 14:48:01.209853424 +0000 UTC m=+5180.719572390" lastFinishedPulling="2026-03-20 14:48:02.271597484 +0000 UTC m=+5181.781316450" observedRunningTime="2026-03-20 14:48:02.995724162 +0000 UTC m=+5182.505443158" watchObservedRunningTime="2026-03-20 14:48:02.998005378 +0000 UTC m=+5182.507724344"
Mar 20 14:48:03 crc kubenswrapper[4895]: I0320 14:48:03.982093 4895 generic.go:334] "Generic (PLEG): container finished" podID="bd6d77fb-5b57-4c86-a555-358411dc9392" containerID="bec683f63221d09824bf5c2847e5cba9d7ef3a49a2205279cc99e8ff7f324709" exitCode=0
Mar 20 14:48:03 crc kubenswrapper[4895]: I0320 14:48:03.982194 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-vkp5t" event={"ID":"bd6d77fb-5b57-4c86-a555-358411dc9392","Type":"ContainerDied","Data":"bec683f63221d09824bf5c2847e5cba9d7ef3a49a2205279cc99e8ff7f324709"}
Mar 20 14:48:06 crc kubenswrapper[4895]: I0320 14:48:06.819088 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:06 crc kubenswrapper[4895]: I0320 14:48:06.879915 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h82g\" (UniqueName: \"kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g\") pod \"bd6d77fb-5b57-4c86-a555-358411dc9392\" (UID: \"bd6d77fb-5b57-4c86-a555-358411dc9392\") "
Mar 20 14:48:06 crc kubenswrapper[4895]: I0320 14:48:06.906808 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g" (OuterVolumeSpecName: "kube-api-access-8h82g") pod "bd6d77fb-5b57-4c86-a555-358411dc9392" (UID: "bd6d77fb-5b57-4c86-a555-358411dc9392"). InnerVolumeSpecName "kube-api-access-8h82g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:48:06 crc kubenswrapper[4895]: I0320 14:48:06.985786 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h82g\" (UniqueName: \"kubernetes.io/projected/bd6d77fb-5b57-4c86-a555-358411dc9392-kube-api-access-8h82g\") on node \"crc\" DevicePath \"\""
Mar 20 14:48:07 crc kubenswrapper[4895]: I0320 14:48:07.015254 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-vkp5t" event={"ID":"bd6d77fb-5b57-4c86-a555-358411dc9392","Type":"ContainerDied","Data":"82998c9960815298b219d59e3cd448d71f6a44d97d78db777283ad0a2c8c8482"}
Mar 20 14:48:07 crc kubenswrapper[4895]: I0320 14:48:07.015294 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82998c9960815298b219d59e3cd448d71f6a44d97d78db777283ad0a2c8c8482"
Mar 20 14:48:07 crc kubenswrapper[4895]: I0320 14:48:07.015345 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-vkp5t"
Mar 20 14:48:07 crc kubenswrapper[4895]: I0320 14:48:07.899305 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-b5kzh"]
Mar 20 14:48:07 crc kubenswrapper[4895]: I0320 14:48:07.908830 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-b5kzh"]
Mar 20 14:48:09 crc kubenswrapper[4895]: I0320 14:48:09.245893 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f07f4f-3d46-49e1-87d9-177e65622278" path="/var/lib/kubelet/pods/c5f07f4f-3d46-49e1-87d9-177e65622278/volumes"
Mar 20 14:48:09 crc kubenswrapper[4895]: I0320 14:48:09.720214 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-25wpm_94168b5c-bd4f-4ad1-a35e-6844e0416997/prometheus-operator/0.log"
Mar 20 14:48:10 crc kubenswrapper[4895]: I0320 14:48:10.095380 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_e4559a68-4eab-4215-835d-37fc5f2ae439/prometheus-operator-admission-webhook/0.log"
Mar 20 14:48:10 crc kubenswrapper[4895]: I0320 14:48:10.124489 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_029af23c-4a48-4160-bfd7-650a11a211dd/prometheus-operator-admission-webhook/0.log"
Mar 20 14:48:10 crc kubenswrapper[4895]: I0320 14:48:10.211690 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:48:10 crc kubenswrapper[4895]: E0320 14:48:10.211950 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:48:10 crc kubenswrapper[4895]: I0320 14:48:10.605745 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-b4g5j_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b/operator/0.log"
Mar 20 14:48:10 crc kubenswrapper[4895]: I0320 14:48:10.647803 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5df5f8d6f4-7jrjf_b024edb6-69fb-4c9f-927f-59a137df1c0f/perses-operator/0.log"
Mar 20 14:48:23 crc kubenswrapper[4895]: I0320 14:48:23.212345 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:48:23 crc kubenswrapper[4895]: E0320 14:48:23.213355 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.166663 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"]
Mar 20 14:48:26 crc kubenswrapper[4895]: E0320 14:48:26.167715 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6d77fb-5b57-4c86-a555-358411dc9392" containerName="oc"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.167732 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6d77fb-5b57-4c86-a555-358411dc9392" containerName="oc"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.181139 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6d77fb-5b57-4c86-a555-358411dc9392" containerName="oc"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.183899 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.226308 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"]
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.300171 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.300295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.300425 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xr5g\" (UniqueName: \"kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.402746 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.402919 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xr5g\" (UniqueName: \"kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.403099 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.403304 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.403524 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.425345 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xr5g\" (UniqueName: \"kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g\") pod \"certified-operators-tmpvh\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:26 crc kubenswrapper[4895]: I0320 14:48:26.506259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:27 crc kubenswrapper[4895]: I0320 14:48:27.304114 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"]
Mar 20 14:48:27 crc kubenswrapper[4895]: I0320 14:48:27.966747 4895 scope.go:117] "RemoveContainer" containerID="ac269453e156e5d0eceff789b3f8b9cfec1675b772fe2543013a9ef7672cb7ad"
Mar 20 14:48:27 crc kubenswrapper[4895]: I0320 14:48:27.999333 4895 scope.go:117] "RemoveContainer" containerID="132e8c907f12cd9d7eef4c7cc1e385c11ff6101346ca010c9e1ac3172ebf1f2f"
Mar 20 14:48:28 crc kubenswrapper[4895]: I0320 14:48:28.263776 4895 generic.go:334] "Generic (PLEG): container finished" podID="72e12b43-3016-498e-b696-142ba96631aa" containerID="c42fa369bbe6da7db3177fa2a59a1af6878386a5a47a7677ac4d778bedb5a742" exitCode=0
Mar 20 14:48:28 crc kubenswrapper[4895]: I0320 14:48:28.263831 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerDied","Data":"c42fa369bbe6da7db3177fa2a59a1af6878386a5a47a7677ac4d778bedb5a742"}
Mar 20 14:48:28 crc kubenswrapper[4895]: I0320 14:48:28.263862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerStarted","Data":"fc9625f6d313bc5b38539f76050c7ca771ce0d0003754280d7b7b7accf5500ad"}
Mar 20 14:48:30 crc kubenswrapper[4895]: I0320 14:48:30.291709 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerStarted","Data":"e40c5c7994f7c725a8c4f7bb6dcbe04989ca5c422a69602438cbfe2e17b8207b"}
Mar 20 14:48:31 crc kubenswrapper[4895]: I0320 14:48:31.313856 4895 generic.go:334] "Generic (PLEG): container finished" podID="72e12b43-3016-498e-b696-142ba96631aa" containerID="e40c5c7994f7c725a8c4f7bb6dcbe04989ca5c422a69602438cbfe2e17b8207b" exitCode=0
Mar 20 14:48:31 crc kubenswrapper[4895]: I0320 14:48:31.314198 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerDied","Data":"e40c5c7994f7c725a8c4f7bb6dcbe04989ca5c422a69602438cbfe2e17b8207b"}
Mar 20 14:48:32 crc kubenswrapper[4895]: I0320 14:48:32.325622 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerStarted","Data":"8eb4c0e28ba7e4ba2e47f47960a7ff9838bcbb3b24c38cf5bd3c745e2145fa39"}
Mar 20 14:48:32 crc kubenswrapper[4895]: I0320 14:48:32.355191 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tmpvh" podStartSLOduration=2.844270811 podStartE2EDuration="6.35517018s" podCreationTimestamp="2026-03-20 14:48:26 +0000 UTC" firstStartedPulling="2026-03-20 14:48:28.266213102 +0000 UTC m=+5207.775932068" lastFinishedPulling="2026-03-20 14:48:31.777112471 +0000 UTC m=+5211.286831437" observedRunningTime="2026-03-20 14:48:32.347165094 +0000 UTC m=+5211.856884060" watchObservedRunningTime="2026-03-20 14:48:32.35517018 +0000 UTC m=+5211.864889146"
Mar 20 14:48:36 crc kubenswrapper[4895]: I0320 14:48:36.213315 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5"
Mar 20 14:48:36 crc kubenswrapper[4895]: E0320 14:48:36.214108 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:48:36 crc kubenswrapper[4895]: I0320 14:48:36.507783 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:36 crc kubenswrapper[4895]: I0320 14:48:36.508074 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:36 crc kubenswrapper[4895]: I0320 14:48:36.562622 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:37 crc kubenswrapper[4895]: I0320 14:48:37.442731 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tmpvh"
Mar 20 14:48:37 crc kubenswrapper[4895]: I0320 14:48:37.493980 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"]
Mar 20 14:48:39 crc kubenswrapper[4895]: I0320 14:48:39.403429 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tmpvh" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="registry-server" containerID="cri-o://8eb4c0e28ba7e4ba2e47f47960a7ff9838bcbb3b24c38cf5bd3c745e2145fa39" gracePeriod=2
Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.424211 4895 generic.go:334] "Generic (PLEG): container finished" podID="72e12b43-3016-498e-b696-142ba96631aa" containerID="8eb4c0e28ba7e4ba2e47f47960a7ff9838bcbb3b24c38cf5bd3c745e2145fa39" exitCode=0
Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.424532 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerDied","Data":"8eb4c0e28ba7e4ba2e47f47960a7ff9838bcbb3b24c38cf5bd3c745e2145fa39"}
Mar
20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.713611 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmpvh" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.751560 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-rfwwx_bd9dac02-de3d-43fb-9046-5280d8131d3b/kube-rbac-proxy/0.log" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.846356 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities\") pod \"72e12b43-3016-498e-b696-142ba96631aa\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.846488 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xr5g\" (UniqueName: \"kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g\") pod \"72e12b43-3016-498e-b696-142ba96631aa\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.846559 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content\") pod \"72e12b43-3016-498e-b696-142ba96631aa\" (UID: \"72e12b43-3016-498e-b696-142ba96631aa\") " Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.847884 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities" (OuterVolumeSpecName: "utilities") pod "72e12b43-3016-498e-b696-142ba96631aa" (UID: "72e12b43-3016-498e-b696-142ba96631aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.859981 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-rfwwx_bd9dac02-de3d-43fb-9046-5280d8131d3b/controller/0.log" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.872688 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g" (OuterVolumeSpecName: "kube-api-access-2xr5g") pod "72e12b43-3016-498e-b696-142ba96631aa" (UID: "72e12b43-3016-498e-b696-142ba96631aa"). InnerVolumeSpecName "kube-api-access-2xr5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.948859 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.948890 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xr5g\" (UniqueName: \"kubernetes.io/projected/72e12b43-3016-498e-b696-142ba96631aa-kube-api-access-2xr5g\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:40 crc kubenswrapper[4895]: I0320 14:48:40.953828 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72e12b43-3016-498e-b696-142ba96631aa" (UID: "72e12b43-3016-498e-b696-142ba96631aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.050623 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e12b43-3016-498e-b696-142ba96631aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.179618 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-frr-files/0.log" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.436213 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmpvh" event={"ID":"72e12b43-3016-498e-b696-142ba96631aa","Type":"ContainerDied","Data":"fc9625f6d313bc5b38539f76050c7ca771ce0d0003754280d7b7b7accf5500ad"} Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.436564 4895 scope.go:117] "RemoveContainer" containerID="8eb4c0e28ba7e4ba2e47f47960a7ff9838bcbb3b24c38cf5bd3c745e2145fa39" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.436720 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tmpvh" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.458535 4895 scope.go:117] "RemoveContainer" containerID="e40c5c7994f7c725a8c4f7bb6dcbe04989ca5c422a69602438cbfe2e17b8207b" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.463432 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"] Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.480645 4895 scope.go:117] "RemoveContainer" containerID="c42fa369bbe6da7db3177fa2a59a1af6878386a5a47a7677ac4d778bedb5a742" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.490641 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tmpvh"] Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.675275 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-reloader/0.log" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.826015 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-metrics/0.log" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.834472 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-frr-files/0.log" Mar 20 14:48:41 crc kubenswrapper[4895]: I0320 14:48:41.954861 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-reloader/0.log" Mar 20 14:48:42 crc kubenswrapper[4895]: I0320 14:48:42.531408 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-metrics/0.log" Mar 20 14:48:42 crc kubenswrapper[4895]: I0320 14:48:42.650300 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-frr-files/0.log" Mar 20 14:48:42 crc kubenswrapper[4895]: I0320 14:48:42.732026 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-reloader/0.log" Mar 20 14:48:42 crc kubenswrapper[4895]: I0320 14:48:42.794031 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-metrics/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.067191 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-frr-files/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.123611 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/controller/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.154955 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-metrics/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.223151 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e12b43-3016-498e-b696-142ba96631aa" path="/var/lib/kubelet/pods/72e12b43-3016-498e-b696-142ba96631aa/volumes" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.267254 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/cp-reloader/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.418943 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/frr-metrics/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.842730 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/kube-rbac-proxy/0.log" Mar 20 14:48:43 crc kubenswrapper[4895]: I0320 14:48:43.905430 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/kube-rbac-proxy-frr/0.log" Mar 20 14:48:44 crc kubenswrapper[4895]: I0320 14:48:44.448502 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/reloader/0.log" Mar 20 14:48:44 crc kubenswrapper[4895]: I0320 14:48:44.493853 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-5g6cb_9f90de29-5eaa-4d44-8988-4623460dc401/frr-k8s-webhook-server/0.log" Mar 20 14:48:44 crc kubenswrapper[4895]: I0320 14:48:44.892782 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7864bd6b9f-n862x_434eee1f-2ee9-41fd-b97a-9f995142dcfc/manager/0.log" Mar 20 14:48:45 crc kubenswrapper[4895]: I0320 14:48:45.212461 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bg9zg_0d956767-71fa-4b4f-b113-659286aa149c/frr/0.log" Mar 20 14:48:45 crc kubenswrapper[4895]: I0320 14:48:45.360699 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67445d8464-hmtkr_a9b00cd0-442e-401d-b062-4a9bd5ae35f9/webhook-server/0.log" Mar 20 14:48:45 crc kubenswrapper[4895]: I0320 14:48:45.448532 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fm5cg_1b17d41e-b2b9-471a-9b0a-1b12899ba46b/kube-rbac-proxy/0.log" Mar 20 14:48:46 crc kubenswrapper[4895]: I0320 14:48:46.079867 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fm5cg_1b17d41e-b2b9-471a-9b0a-1b12899ba46b/speaker/0.log" Mar 20 14:48:51 crc kubenswrapper[4895]: I0320 14:48:51.228160 4895 scope.go:117] 
"RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:48:51 crc kubenswrapper[4895]: E0320 14:48:51.230031 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:49:06 crc kubenswrapper[4895]: I0320 14:49:06.212223 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:49:06 crc kubenswrapper[4895]: E0320 14:49:06.212961 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.485515 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:11 crc kubenswrapper[4895]: E0320 14:49:11.486278 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="registry-server" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.486290 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="registry-server" Mar 20 14:49:11 crc kubenswrapper[4895]: E0320 14:49:11.486316 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e12b43-3016-498e-b696-142ba96631aa" 
containerName="extract-content" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.486322 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="extract-content" Mar 20 14:49:11 crc kubenswrapper[4895]: E0320 14:49:11.486346 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="extract-utilities" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.486352 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="extract-utilities" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.486554 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e12b43-3016-498e-b696-142ba96631aa" containerName="registry-server" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.487952 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.502945 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.528558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rn5\" (UniqueName: \"kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.528728 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " 
pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.529049 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.631560 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.631696 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.631718 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rn5\" (UniqueName: \"kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.632186 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " 
pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.632257 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.655147 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rn5\" (UniqueName: \"kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5\") pod \"redhat-operators-knmrx\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:11 crc kubenswrapper[4895]: I0320 14:49:11.813742 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:12 crc kubenswrapper[4895]: I0320 14:49:12.621851 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:12 crc kubenswrapper[4895]: I0320 14:49:12.809922 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerStarted","Data":"e26aa1c6cb2080fc3b5902fb0e97a311e3e85515f00beb36fad3448cc5a59290"} Mar 20 14:49:13 crc kubenswrapper[4895]: I0320 14:49:13.755406 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/util/0.log" Mar 20 14:49:13 crc kubenswrapper[4895]: I0320 14:49:13.851276 4895 generic.go:334] "Generic (PLEG): container finished" podID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" 
containerID="867e16cc4f6317f95844e666d0dae9a6960c51bfdd7509b98a2d09c0c8fe0953" exitCode=0 Mar 20 14:49:13 crc kubenswrapper[4895]: I0320 14:49:13.851330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerDied","Data":"867e16cc4f6317f95844e666d0dae9a6960c51bfdd7509b98a2d09c0c8fe0953"} Mar 20 14:49:13 crc kubenswrapper[4895]: I0320 14:49:13.855698 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:49:14 crc kubenswrapper[4895]: I0320 14:49:14.125510 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/pull/0.log" Mar 20 14:49:14 crc kubenswrapper[4895]: I0320 14:49:14.229957 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/util/0.log" Mar 20 14:49:14 crc kubenswrapper[4895]: I0320 14:49:14.423958 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/pull/0.log" Mar 20 14:49:14 crc kubenswrapper[4895]: I0320 14:49:14.848688 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/util/0.log" Mar 20 14:49:14 crc kubenswrapper[4895]: I0320 14:49:14.978165 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/pull/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.044163 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745xl5k_bbb0cccd-3628-4791-a0e2-c4042fb00e33/extract/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.316093 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/util/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.642328 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/pull/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.737961 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/util/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.789274 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/pull/0.log" Mar 20 14:49:15 crc kubenswrapper[4895]: I0320 14:49:15.873777 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerStarted","Data":"3b5b816ee049c67db9ecc020264752a9bffc38037feeb474b4c96c0e44721afc"} Mar 20 14:49:16 crc kubenswrapper[4895]: I0320 14:49:16.052009 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/util/0.log" Mar 20 14:49:16 crc kubenswrapper[4895]: I0320 14:49:16.170163 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/pull/0.log" Mar 20 14:49:16 crc kubenswrapper[4895]: I0320 14:49:16.278714 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1wwvhc_b50bb0b3-ed45-4db8-b26d-33f0c8d3cfaf/extract/0.log" Mar 20 14:49:16 crc kubenswrapper[4895]: I0320 14:49:16.514079 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/util/0.log" Mar 20 14:49:16 crc kubenswrapper[4895]: I0320 14:49:16.889006 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/util/0.log" Mar 20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.041343 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/pull/0.log" Mar 20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.076744 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/pull/0.log" Mar 20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.518471 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/util/0.log" Mar 20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.637648 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/extract/0.log" Mar 
20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.700532 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397268knfc_16d1df97-cdad-4acd-8aa4-66383ab645cc/pull/0.log" Mar 20 14:49:17 crc kubenswrapper[4895]: I0320 14:49:17.956193 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/util/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.171612 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/pull/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.203334 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/pull/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.256411 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/util/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.399565 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/util/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.403488 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/pull/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.418803 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcbthlz_cec8c6d6-c364-4bb7-aea4-931b5b4774e1/extract/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.863809 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-utilities/0.log" Mar 20 14:49:18 crc kubenswrapper[4895]: I0320 14:49:18.968130 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-utilities/0.log" Mar 20 14:49:19 crc kubenswrapper[4895]: I0320 14:49:19.064757 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-content/0.log" Mar 20 14:49:19 crc kubenswrapper[4895]: I0320 14:49:19.065854 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-content/0.log" Mar 20 14:49:19 crc kubenswrapper[4895]: I0320 14:49:19.382129 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-utilities/0.log" Mar 20 14:49:19 crc kubenswrapper[4895]: I0320 14:49:19.498657 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/extract-content/0.log" Mar 20 14:49:19 crc kubenswrapper[4895]: I0320 14:49:19.840736 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6l6m_27bd4b2f-9705-45e5-b579-04e05fb9abde/registry-server/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.034971 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-utilities/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.211260 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:49:20 crc kubenswrapper[4895]: E0320 14:49:20.211744 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.289326 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-utilities/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.396762 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-content/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.566819 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-content/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.761220 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-content/0.log" Mar 20 14:49:20 crc kubenswrapper[4895]: I0320 14:49:20.785532 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/extract-utilities/0.log" Mar 20 14:49:21 
crc kubenswrapper[4895]: I0320 14:49:21.178323 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q72qk_8d7099f0-1367-48df-962f-2a7d34147dc9/marketplace-operator/0.log" Mar 20 14:49:21 crc kubenswrapper[4895]: I0320 14:49:21.346554 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6wcbv_045dcbb1-ad32-451d-ad2d-1f2c243bb0ee/registry-server/0.log" Mar 20 14:49:21 crc kubenswrapper[4895]: I0320 14:49:21.416193 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-utilities/0.log" Mar 20 14:49:21 crc kubenswrapper[4895]: I0320 14:49:21.790137 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-content/0.log" Mar 20 14:49:21 crc kubenswrapper[4895]: I0320 14:49:21.934724 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-content/0.log" Mar 20 14:49:22 crc kubenswrapper[4895]: I0320 14:49:22.380044 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-utilities/0.log" Mar 20 14:49:22 crc kubenswrapper[4895]: I0320 14:49:22.791363 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-utilities/0.log" Mar 20 14:49:22 crc kubenswrapper[4895]: I0320 14:49:22.958733 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/registry-server/0.log" Mar 20 14:49:22 crc kubenswrapper[4895]: I0320 14:49:22.959880 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xnqrs_5e11a404-9171-42bf-83c6-341c3db05c44/extract-content/0.log" Mar 20 14:49:23 crc kubenswrapper[4895]: I0320 14:49:23.416627 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-utilities/0.log" Mar 20 14:49:23 crc kubenswrapper[4895]: I0320 14:49:23.886825 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-content/0.log" Mar 20 14:49:23 crc kubenswrapper[4895]: I0320 14:49:23.949954 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-utilities/0.log" Mar 20 14:49:23 crc kubenswrapper[4895]: I0320 14:49:23.958256 4895 generic.go:334] "Generic (PLEG): container finished" podID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerID="3b5b816ee049c67db9ecc020264752a9bffc38037feeb474b4c96c0e44721afc" exitCode=0 Mar 20 14:49:23 crc kubenswrapper[4895]: I0320 14:49:23.958295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerDied","Data":"3b5b816ee049c67db9ecc020264752a9bffc38037feeb474b4c96c0e44721afc"} Mar 20 14:49:24 crc kubenswrapper[4895]: I0320 14:49:24.176765 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-content/0.log" Mar 20 14:49:24 crc kubenswrapper[4895]: I0320 14:49:24.516174 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-utilities/0.log" Mar 20 14:49:24 crc kubenswrapper[4895]: I0320 14:49:24.618631 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/extract-content/0.log" Mar 20 14:49:24 crc kubenswrapper[4895]: I0320 14:49:24.632319 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-utilities/0.log" Mar 20 14:49:24 crc kubenswrapper[4895]: I0320 14:49:24.979313 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerStarted","Data":"831df1875df594dfa60655a83b967d2ae5cd348817d483b519140f94fdc105f6"} Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.009567 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knmrx" podStartSLOduration=3.34864886 podStartE2EDuration="14.009539824s" podCreationTimestamp="2026-03-20 14:49:11 +0000 UTC" firstStartedPulling="2026-03-20 14:49:13.855404714 +0000 UTC m=+5253.365123690" lastFinishedPulling="2026-03-20 14:49:24.516295678 +0000 UTC m=+5264.026014654" observedRunningTime="2026-03-20 14:49:25.002365599 +0000 UTC m=+5264.512084585" watchObservedRunningTime="2026-03-20 14:49:25.009539824 +0000 UTC m=+5264.519258790" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.052853 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-content/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.120282 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-content/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.200856 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-utilities/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.310049 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jk92b_602120eb-39e8-4b29-a2f9-7ff1fe5a0222/registry-server/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.655017 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-utilities/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.851067 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/extract-content/0.log" Mar 20 14:49:25 crc kubenswrapper[4895]: I0320 14:49:25.918039 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-knmrx_332c5178-90fd-43d2-8e83-fbdeb958a0f4/registry-server/0.log" Mar 20 14:49:31 crc kubenswrapper[4895]: I0320 14:49:31.815031 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:31 crc kubenswrapper[4895]: I0320 14:49:31.816525 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:33 crc kubenswrapper[4895]: I0320 14:49:33.512677 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knmrx" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" probeResult="failure" output=< Mar 20 14:49:33 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:49:33 crc kubenswrapper[4895]: > Mar 20 14:49:34 crc kubenswrapper[4895]: I0320 14:49:34.212066 4895 scope.go:117] "RemoveContainer" 
containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:49:34 crc kubenswrapper[4895]: E0320 14:49:34.212663 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:49:42 crc kubenswrapper[4895]: I0320 14:49:42.976678 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knmrx" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" probeResult="failure" output=< Mar 20 14:49:42 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Mar 20 14:49:42 crc kubenswrapper[4895]: > Mar 20 14:49:49 crc kubenswrapper[4895]: I0320 14:49:49.211692 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:49:49 crc kubenswrapper[4895]: E0320 14:49:49.212703 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:49:52 crc kubenswrapper[4895]: I0320 14:49:52.390104 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:52 crc kubenswrapper[4895]: I0320 14:49:52.480752 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:52 crc kubenswrapper[4895]: I0320 14:49:52.634427 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:53 crc kubenswrapper[4895]: I0320 14:49:53.839835 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-25wpm_94168b5c-bd4f-4ad1-a35e-6844e0416997/prometheus-operator/0.log" Mar 20 14:49:53 crc kubenswrapper[4895]: I0320 14:49:53.935239 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-855dcf8c6c-xfrrd_e4559a68-4eab-4215-835d-37fc5f2ae439/prometheus-operator-admission-webhook/0.log" Mar 20 14:49:53 crc kubenswrapper[4895]: I0320 14:49:53.941776 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-855dcf8c6c-zjvrm_029af23c-4a48-4160-bfd7-650a11a211dd/prometheus-operator-admission-webhook/0.log" Mar 20 14:49:54 crc kubenswrapper[4895]: I0320 14:49:54.218731 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-b4g5j_875ae527-38d2-4d9c-bcdc-5ca7f9a9f17b/operator/0.log" Mar 20 14:49:54 crc kubenswrapper[4895]: I0320 14:49:54.326784 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knmrx" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" containerID="cri-o://831df1875df594dfa60655a83b967d2ae5cd348817d483b519140f94fdc105f6" gracePeriod=2 Mar 20 14:49:54 crc kubenswrapper[4895]: I0320 14:49:54.426081 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5df5f8d6f4-7jrjf_b024edb6-69fb-4c9f-927f-59a137df1c0f/perses-operator/0.log" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.336088 4895 generic.go:334] "Generic (PLEG): 
container finished" podID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerID="831df1875df594dfa60655a83b967d2ae5cd348817d483b519140f94fdc105f6" exitCode=0 Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.336171 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerDied","Data":"831df1875df594dfa60655a83b967d2ae5cd348817d483b519140f94fdc105f6"} Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.742860 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.842523 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content\") pod \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.842673 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities\") pod \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.842837 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74rn5\" (UniqueName: \"kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5\") pod \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\" (UID: \"332c5178-90fd-43d2-8e83-fbdeb958a0f4\") " Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.846053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities" (OuterVolumeSpecName: "utilities") pod 
"332c5178-90fd-43d2-8e83-fbdeb958a0f4" (UID: "332c5178-90fd-43d2-8e83-fbdeb958a0f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.853434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5" (OuterVolumeSpecName: "kube-api-access-74rn5") pod "332c5178-90fd-43d2-8e83-fbdeb958a0f4" (UID: "332c5178-90fd-43d2-8e83-fbdeb958a0f4"). InnerVolumeSpecName "kube-api-access-74rn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.945580 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.945616 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74rn5\" (UniqueName: \"kubernetes.io/projected/332c5178-90fd-43d2-8e83-fbdeb958a0f4-kube-api-access-74rn5\") on node \"crc\" DevicePath \"\"" Mar 20 14:49:55 crc kubenswrapper[4895]: I0320 14:49:55.996150 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "332c5178-90fd-43d2-8e83-fbdeb958a0f4" (UID: "332c5178-90fd-43d2-8e83-fbdeb958a0f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.047922 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332c5178-90fd-43d2-8e83-fbdeb958a0f4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.348329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmrx" event={"ID":"332c5178-90fd-43d2-8e83-fbdeb958a0f4","Type":"ContainerDied","Data":"e26aa1c6cb2080fc3b5902fb0e97a311e3e85515f00beb36fad3448cc5a59290"} Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.348379 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knmrx" Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.348411 4895 scope.go:117] "RemoveContainer" containerID="831df1875df594dfa60655a83b967d2ae5cd348817d483b519140f94fdc105f6" Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.395256 4895 scope.go:117] "RemoveContainer" containerID="3b5b816ee049c67db9ecc020264752a9bffc38037feeb474b4c96c0e44721afc" Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.399436 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.427969 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knmrx"] Mar 20 14:49:56 crc kubenswrapper[4895]: I0320 14:49:56.434537 4895 scope.go:117] "RemoveContainer" containerID="867e16cc4f6317f95844e666d0dae9a6960c51bfdd7509b98a2d09c0c8fe0953" Mar 20 14:49:57 crc kubenswrapper[4895]: I0320 14:49:57.223889 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" path="/var/lib/kubelet/pods/332c5178-90fd-43d2-8e83-fbdeb958a0f4/volumes" Mar 20 14:50:00 crc 
kubenswrapper[4895]: I0320 14:50:00.145373 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566970-zssq9"] Mar 20 14:50:00 crc kubenswrapper[4895]: E0320 14:50:00.146442 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.146456 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4895]: E0320 14:50:00.146470 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="extract-content" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.146477 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="extract-content" Mar 20 14:50:00 crc kubenswrapper[4895]: E0320 14:50:00.146523 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.146531 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.146774 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="332c5178-90fd-43d2-8e83-fbdeb958a0f4" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.148275 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.150540 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.150629 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.151079 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.170771 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-zssq9"] Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.233035 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dkff\" (UniqueName: \"kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff\") pod \"auto-csr-approver-29566970-zssq9\" (UID: \"ed4cc896-130e-48d3-86c3-c56c202cd7e8\") " pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.335187 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dkff\" (UniqueName: \"kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff\") pod \"auto-csr-approver-29566970-zssq9\" (UID: \"ed4cc896-130e-48d3-86c3-c56c202cd7e8\") " pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.357385 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dkff\" (UniqueName: \"kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff\") pod \"auto-csr-approver-29566970-zssq9\" (UID: \"ed4cc896-130e-48d3-86c3-c56c202cd7e8\") " 
pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:00 crc kubenswrapper[4895]: I0320 14:50:00.476824 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:01 crc kubenswrapper[4895]: I0320 14:50:01.222305 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:50:01 crc kubenswrapper[4895]: E0320 14:50:01.223237 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:50:01 crc kubenswrapper[4895]: I0320 14:50:01.356720 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-zssq9"] Mar 20 14:50:02 crc kubenswrapper[4895]: I0320 14:50:02.453697 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-zssq9" event={"ID":"ed4cc896-130e-48d3-86c3-c56c202cd7e8","Type":"ContainerStarted","Data":"2570abe1178a86c7109d5f44a2bbb4659fdd2017c46b4e97d3e9d96d0bdbbe27"} Mar 20 14:50:04 crc kubenswrapper[4895]: I0320 14:50:04.489746 4895 generic.go:334] "Generic (PLEG): container finished" podID="ed4cc896-130e-48d3-86c3-c56c202cd7e8" containerID="53a1be03390d3192e2aab06b0c8dccd3382fd3db581451f841ddefb08b658da7" exitCode=0 Mar 20 14:50:04 crc kubenswrapper[4895]: I0320 14:50:04.489885 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-zssq9" event={"ID":"ed4cc896-130e-48d3-86c3-c56c202cd7e8","Type":"ContainerDied","Data":"53a1be03390d3192e2aab06b0c8dccd3382fd3db581451f841ddefb08b658da7"} 
Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.516063 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-zssq9" event={"ID":"ed4cc896-130e-48d3-86c3-c56c202cd7e8","Type":"ContainerDied","Data":"2570abe1178a86c7109d5f44a2bbb4659fdd2017c46b4e97d3e9d96d0bdbbe27"} Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.516546 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2570abe1178a86c7109d5f44a2bbb4659fdd2017c46b4e97d3e9d96d0bdbbe27" Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.569847 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.675041 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dkff\" (UniqueName: \"kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff\") pod \"ed4cc896-130e-48d3-86c3-c56c202cd7e8\" (UID: \"ed4cc896-130e-48d3-86c3-c56c202cd7e8\") " Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.696787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff" (OuterVolumeSpecName: "kube-api-access-8dkff") pod "ed4cc896-130e-48d3-86c3-c56c202cd7e8" (UID: "ed4cc896-130e-48d3-86c3-c56c202cd7e8"). InnerVolumeSpecName "kube-api-access-8dkff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:50:06 crc kubenswrapper[4895]: I0320 14:50:06.777350 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dkff\" (UniqueName: \"kubernetes.io/projected/ed4cc896-130e-48d3-86c3-c56c202cd7e8-kube-api-access-8dkff\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:07 crc kubenswrapper[4895]: I0320 14:50:07.524019 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-zssq9" Mar 20 14:50:07 crc kubenswrapper[4895]: I0320 14:50:07.646093 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hlw62"] Mar 20 14:50:07 crc kubenswrapper[4895]: I0320 14:50:07.655318 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hlw62"] Mar 20 14:50:09 crc kubenswrapper[4895]: I0320 14:50:09.227487 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe1df06-4d13-45e2-898d-deae9f895a68" path="/var/lib/kubelet/pods/7fe1df06-4d13-45e2-898d-deae9f895a68/volumes" Mar 20 14:50:13 crc kubenswrapper[4895]: I0320 14:50:13.212027 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:50:13 crc kubenswrapper[4895]: E0320 14:50:13.212864 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:50:24 crc kubenswrapper[4895]: I0320 14:50:24.764666 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dd7dbfbcf-qkd9d_bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca/kube-rbac-proxy/0.log" Mar 20 14:50:25 crc kubenswrapper[4895]: I0320 14:50:25.121080 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dd7dbfbcf-qkd9d_bc3a70cf-05df-46d6-9e5c-25fc9a0d8aca/manager/0.log" Mar 20 14:50:25 crc kubenswrapper[4895]: I0320 14:50:25.214700 4895 scope.go:117] "RemoveContainer" 
containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:50:25 crc kubenswrapper[4895]: E0320 14:50:25.214932 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:50:28 crc kubenswrapper[4895]: I0320 14:50:28.149755 4895 scope.go:117] "RemoveContainer" containerID="2a57fb5b93e65149b21a8138232395c0061ed5560da7ea565817de27b71d7f52" Mar 20 14:50:40 crc kubenswrapper[4895]: I0320 14:50:40.211429 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:50:40 crc kubenswrapper[4895]: E0320 14:50:40.213164 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:50:53 crc kubenswrapper[4895]: I0320 14:50:53.211525 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:50:53 crc kubenswrapper[4895]: E0320 14:50:53.212279 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:51:07 crc kubenswrapper[4895]: I0320 14:51:07.212565 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:51:07 crc kubenswrapper[4895]: E0320 14:51:07.213445 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:51:20 crc kubenswrapper[4895]: I0320 14:51:20.220587 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:51:20 crc kubenswrapper[4895]: E0320 14:51:20.221888 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:51:34 crc kubenswrapper[4895]: I0320 14:51:34.211780 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:51:34 crc kubenswrapper[4895]: E0320 14:51:34.212579 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.116531 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:51:46 crc kubenswrapper[4895]: E0320 14:51:46.119724 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4cc896-130e-48d3-86c3-c56c202cd7e8" containerName="oc" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.119758 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4cc896-130e-48d3-86c3-c56c202cd7e8" containerName="oc" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.120091 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4cc896-130e-48d3-86c3-c56c202cd7e8" containerName="oc" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.123107 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.148693 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.237870 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.238223 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.238343 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hfv\" (UniqueName: \"kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.340314 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.340361 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-56hfv\" (UniqueName: \"kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.340562 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.341244 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:46 crc kubenswrapper[4895]: I0320 14:51:46.341764 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:47 crc kubenswrapper[4895]: I0320 14:51:47.032200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hfv\" (UniqueName: \"kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv\") pod \"community-operators-jmgrj\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:47 crc kubenswrapper[4895]: I0320 14:51:47.052020 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:47 crc kubenswrapper[4895]: I0320 14:51:47.876924 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:51:48 crc kubenswrapper[4895]: I0320 14:51:48.211933 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:51:48 crc kubenswrapper[4895]: E0320 14:51:48.212377 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" Mar 20 14:51:48 crc kubenswrapper[4895]: I0320 14:51:48.868067 4895 generic.go:334] "Generic (PLEG): container finished" podID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerID="30d0440ad90f044706a81395a5dae782d5302be26b31fa05d19b3ea409cd78a0" exitCode=0 Mar 20 14:51:48 crc kubenswrapper[4895]: I0320 14:51:48.868119 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerDied","Data":"30d0440ad90f044706a81395a5dae782d5302be26b31fa05d19b3ea409cd78a0"} Mar 20 14:51:48 crc kubenswrapper[4895]: I0320 14:51:48.868148 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerStarted","Data":"6fd929f0740cc45515a24c037a4568c2c7ed9af7727938f2355a1bfeea684f58"} Mar 20 14:51:50 crc kubenswrapper[4895]: I0320 14:51:50.887284 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" 
event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerStarted","Data":"0eea3b1e6a7f82fb0628f9ecc427524f18ed92d4c0c537e7fa1eeb73f06549b9"} Mar 20 14:51:52 crc kubenswrapper[4895]: I0320 14:51:52.911189 4895 generic.go:334] "Generic (PLEG): container finished" podID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerID="0eea3b1e6a7f82fb0628f9ecc427524f18ed92d4c0c537e7fa1eeb73f06549b9" exitCode=0 Mar 20 14:51:52 crc kubenswrapper[4895]: I0320 14:51:52.911272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerDied","Data":"0eea3b1e6a7f82fb0628f9ecc427524f18ed92d4c0c537e7fa1eeb73f06549b9"} Mar 20 14:51:53 crc kubenswrapper[4895]: I0320 14:51:53.923735 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerStarted","Data":"ad72af0708a2b3685fa7075b858ed42f670cfec5f51dc4edf2cec1a1dfa75d8f"} Mar 20 14:51:53 crc kubenswrapper[4895]: I0320 14:51:53.944841 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jmgrj" podStartSLOduration=3.467118429 podStartE2EDuration="7.944822999s" podCreationTimestamp="2026-03-20 14:51:46 +0000 UTC" firstStartedPulling="2026-03-20 14:51:48.869868453 +0000 UTC m=+5408.379587419" lastFinishedPulling="2026-03-20 14:51:53.347573033 +0000 UTC m=+5412.857291989" observedRunningTime="2026-03-20 14:51:53.943042336 +0000 UTC m=+5413.452761302" watchObservedRunningTime="2026-03-20 14:51:53.944822999 +0000 UTC m=+5413.454541965" Mar 20 14:51:57 crc kubenswrapper[4895]: I0320 14:51:57.052974 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:57 crc kubenswrapper[4895]: I0320 14:51:57.055575 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:57 crc kubenswrapper[4895]: I0320 14:51:57.113299 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:59 crc kubenswrapper[4895]: I0320 14:51:59.018015 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:51:59 crc kubenswrapper[4895]: I0320 14:51:59.076910 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:51:59 crc kubenswrapper[4895]: I0320 14:51:59.211484 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:51:59 crc kubenswrapper[4895]: I0320 14:51:59.980493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f"} Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.162462 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566972-jbc9n"] Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.164345 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.178239 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-jbc9n"] Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.186798 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.187171 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.187415 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.244823 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7g4\" (UniqueName: \"kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4\") pod \"auto-csr-approver-29566972-jbc9n\" (UID: \"bdebf311-01b8-4474-b598-4458290a0100\") " pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.346871 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7g4\" (UniqueName: \"kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4\") pod \"auto-csr-approver-29566972-jbc9n\" (UID: \"bdebf311-01b8-4474-b598-4458290a0100\") " pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.368987 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7g4\" (UniqueName: \"kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4\") pod \"auto-csr-approver-29566972-jbc9n\" (UID: \"bdebf311-01b8-4474-b598-4458290a0100\") " 
pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.517159 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:00 crc kubenswrapper[4895]: I0320 14:52:00.989621 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jmgrj" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="registry-server" containerID="cri-o://ad72af0708a2b3685fa7075b858ed42f670cfec5f51dc4edf2cec1a1dfa75d8f" gracePeriod=2 Mar 20 14:52:01 crc kubenswrapper[4895]: I0320 14:52:01.402759 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-jbc9n"] Mar 20 14:52:01 crc kubenswrapper[4895]: I0320 14:52:01.999915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" event={"ID":"bdebf311-01b8-4474-b598-4458290a0100","Type":"ContainerStarted","Data":"df5ae8bdf1b6b360cc9ee044116306440730b0658bd727c7175932b5668be99c"} Mar 20 14:52:02 crc kubenswrapper[4895]: I0320 14:52:02.003144 4895 generic.go:334] "Generic (PLEG): container finished" podID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerID="ad72af0708a2b3685fa7075b858ed42f670cfec5f51dc4edf2cec1a1dfa75d8f" exitCode=0 Mar 20 14:52:02 crc kubenswrapper[4895]: I0320 14:52:02.003189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerDied","Data":"ad72af0708a2b3685fa7075b858ed42f670cfec5f51dc4edf2cec1a1dfa75d8f"} Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.014458 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" 
event={"ID":"bdebf311-01b8-4474-b598-4458290a0100","Type":"ContainerStarted","Data":"17a145f12d7b8c410c521ca54ec98b8622d106255fb9f384421e3962af7982d4"} Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.040999 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" podStartSLOduration=2.14902228 podStartE2EDuration="3.04097544s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="2026-03-20 14:52:01.426131542 +0000 UTC m=+5420.935850508" lastFinishedPulling="2026-03-20 14:52:02.318084712 +0000 UTC m=+5421.827803668" observedRunningTime="2026-03-20 14:52:03.029152581 +0000 UTC m=+5422.538871547" watchObservedRunningTime="2026-03-20 14:52:03.04097544 +0000 UTC m=+5422.550694406" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.406694 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.554570 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities\") pod \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.555798 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56hfv\" (UniqueName: \"kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv\") pod \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.555917 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content\") pod 
\"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\" (UID: \"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5\") " Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.555637 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities" (OuterVolumeSpecName: "utilities") pod "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" (UID: "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.596729 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv" (OuterVolumeSpecName: "kube-api-access-56hfv") pod "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" (UID: "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5"). InnerVolumeSpecName "kube-api-access-56hfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.659274 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.659311 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56hfv\" (UniqueName: \"kubernetes.io/projected/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-kube-api-access-56hfv\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.691614 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" (UID: "0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:03 crc kubenswrapper[4895]: I0320 14:52:03.762148 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.026698 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmgrj" Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.027037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmgrj" event={"ID":"0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5","Type":"ContainerDied","Data":"6fd929f0740cc45515a24c037a4568c2c7ed9af7727938f2355a1bfeea684f58"} Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.027094 4895 scope.go:117] "RemoveContainer" containerID="ad72af0708a2b3685fa7075b858ed42f670cfec5f51dc4edf2cec1a1dfa75d8f" Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.028782 4895 generic.go:334] "Generic (PLEG): container finished" podID="bdebf311-01b8-4474-b598-4458290a0100" containerID="17a145f12d7b8c410c521ca54ec98b8622d106255fb9f384421e3962af7982d4" exitCode=0 Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.028827 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" event={"ID":"bdebf311-01b8-4474-b598-4458290a0100","Type":"ContainerDied","Data":"17a145f12d7b8c410c521ca54ec98b8622d106255fb9f384421e3962af7982d4"} Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.076087 4895 scope.go:117] "RemoveContainer" containerID="0eea3b1e6a7f82fb0628f9ecc427524f18ed92d4c0c537e7fa1eeb73f06549b9" Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.083228 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:52:04 crc kubenswrapper[4895]: 
I0320 14:52:04.101311 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jmgrj"] Mar 20 14:52:04 crc kubenswrapper[4895]: I0320 14:52:04.124364 4895 scope.go:117] "RemoveContainer" containerID="30d0440ad90f044706a81395a5dae782d5302be26b31fa05d19b3ea409cd78a0" Mar 20 14:52:05 crc kubenswrapper[4895]: I0320 14:52:05.240693 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" path="/var/lib/kubelet/pods/0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5/volumes" Mar 20 14:52:06 crc kubenswrapper[4895]: I0320 14:52:06.317015 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:06 crc kubenswrapper[4895]: I0320 14:52:06.435183 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7g4\" (UniqueName: \"kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4\") pod \"bdebf311-01b8-4474-b598-4458290a0100\" (UID: \"bdebf311-01b8-4474-b598-4458290a0100\") " Mar 20 14:52:06 crc kubenswrapper[4895]: I0320 14:52:06.463132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4" (OuterVolumeSpecName: "kube-api-access-vb7g4") pod "bdebf311-01b8-4474-b598-4458290a0100" (UID: "bdebf311-01b8-4474-b598-4458290a0100"). InnerVolumeSpecName "kube-api-access-vb7g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:06 crc kubenswrapper[4895]: I0320 14:52:06.538316 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7g4\" (UniqueName: \"kubernetes.io/projected/bdebf311-01b8-4474-b598-4458290a0100-kube-api-access-vb7g4\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:07 crc kubenswrapper[4895]: I0320 14:52:07.104493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" event={"ID":"bdebf311-01b8-4474-b598-4458290a0100","Type":"ContainerDied","Data":"df5ae8bdf1b6b360cc9ee044116306440730b0658bd727c7175932b5668be99c"} Mar 20 14:52:07 crc kubenswrapper[4895]: I0320 14:52:07.104708 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df5ae8bdf1b6b360cc9ee044116306440730b0658bd727c7175932b5668be99c" Mar 20 14:52:07 crc kubenswrapper[4895]: I0320 14:52:07.104785 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-jbc9n" Mar 20 14:52:07 crc kubenswrapper[4895]: I0320 14:52:07.401666 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-5fsn5"] Mar 20 14:52:07 crc kubenswrapper[4895]: I0320 14:52:07.412490 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-5fsn5"] Mar 20 14:52:09 crc kubenswrapper[4895]: I0320 14:52:09.242882 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70aff11-9c5c-4e3e-b78d-85edca750405" path="/var/lib/kubelet/pods/a70aff11-9c5c-4e3e-b78d-85edca750405/volumes" Mar 20 14:52:28 crc kubenswrapper[4895]: I0320 14:52:28.291318 4895 scope.go:117] "RemoveContainer" containerID="9b63fa02113b7d01900518a91f87a146bbe3861011d1dfeb8c48940ee894027c" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.033791 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:33 crc kubenswrapper[4895]: E0320 14:52:33.041504 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="extract-content" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041532 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="extract-content" Mar 20 14:52:33 crc kubenswrapper[4895]: E0320 14:52:33.041556 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdebf311-01b8-4474-b598-4458290a0100" containerName="oc" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041563 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdebf311-01b8-4474-b598-4458290a0100" containerName="oc" Mar 20 14:52:33 crc kubenswrapper[4895]: E0320 14:52:33.041581 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="registry-server" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041588 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="registry-server" Mar 20 14:52:33 crc kubenswrapper[4895]: E0320 14:52:33.041610 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="extract-utilities" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041616 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="extract-utilities" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041920 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd6e4cd-9d2e-4f1b-a070-3e2c0d6a79e5" containerName="registry-server" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.041938 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdebf311-01b8-4474-b598-4458290a0100" 
containerName="oc" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.043411 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.046327 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.147006 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4525j\" (UniqueName: \"kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.147371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.147519 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.249158 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " 
pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.249227 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.249381 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4525j\" (UniqueName: \"kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.249670 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.249742 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.281869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4525j\" (UniqueName: \"kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j\") pod \"redhat-marketplace-lqlfx\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " 
pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:33 crc kubenswrapper[4895]: I0320 14:52:33.362090 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:34 crc kubenswrapper[4895]: I0320 14:52:34.150250 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:34 crc kubenswrapper[4895]: I0320 14:52:34.422829 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerStarted","Data":"e7247c567106256438eb456f7d0e49831f68cc306112ef0a38de8ba956bda58a"} Mar 20 14:52:35 crc kubenswrapper[4895]: I0320 14:52:35.444479 4895 generic.go:334] "Generic (PLEG): container finished" podID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerID="70747841bc006888764b0f3ff3a09750cc6d281c9064a0b53140a3dcc7977608" exitCode=0 Mar 20 14:52:35 crc kubenswrapper[4895]: I0320 14:52:35.444606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerDied","Data":"70747841bc006888764b0f3ff3a09750cc6d281c9064a0b53140a3dcc7977608"} Mar 20 14:52:36 crc kubenswrapper[4895]: I0320 14:52:36.471641 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerStarted","Data":"97d190db3fa7b946835157aaaf205167f8dc9561c70a3b8beb2ebeb4b08276ab"} Mar 20 14:52:37 crc kubenswrapper[4895]: I0320 14:52:37.481956 4895 generic.go:334] "Generic (PLEG): container finished" podID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerID="97d190db3fa7b946835157aaaf205167f8dc9561c70a3b8beb2ebeb4b08276ab" exitCode=0 Mar 20 14:52:37 crc kubenswrapper[4895]: I0320 14:52:37.482188 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerDied","Data":"97d190db3fa7b946835157aaaf205167f8dc9561c70a3b8beb2ebeb4b08276ab"} Mar 20 14:52:38 crc kubenswrapper[4895]: I0320 14:52:38.493300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerStarted","Data":"6b3c0aed99c1c82a6d0ab588b183c9ad69f11a132153a83b6d8130d69a62e984"} Mar 20 14:52:38 crc kubenswrapper[4895]: I0320 14:52:38.528855 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqlfx" podStartSLOduration=3.008824222 podStartE2EDuration="5.528835593s" podCreationTimestamp="2026-03-20 14:52:33 +0000 UTC" firstStartedPulling="2026-03-20 14:52:35.446822826 +0000 UTC m=+5454.956541792" lastFinishedPulling="2026-03-20 14:52:37.966834197 +0000 UTC m=+5457.476553163" observedRunningTime="2026-03-20 14:52:38.519196498 +0000 UTC m=+5458.028915474" watchObservedRunningTime="2026-03-20 14:52:38.528835593 +0000 UTC m=+5458.038554549" Mar 20 14:52:43 crc kubenswrapper[4895]: I0320 14:52:43.362748 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:43 crc kubenswrapper[4895]: I0320 14:52:43.363297 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:43 crc kubenswrapper[4895]: I0320 14:52:43.424009 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:43 crc kubenswrapper[4895]: I0320 14:52:43.609069 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:43 crc kubenswrapper[4895]: I0320 14:52:43.672890 4895 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:45 crc kubenswrapper[4895]: I0320 14:52:45.568749 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqlfx" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="registry-server" containerID="cri-o://6b3c0aed99c1c82a6d0ab588b183c9ad69f11a132153a83b6d8130d69a62e984" gracePeriod=2 Mar 20 14:52:46 crc kubenswrapper[4895]: I0320 14:52:46.582057 4895 generic.go:334] "Generic (PLEG): container finished" podID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerID="6b3c0aed99c1c82a6d0ab588b183c9ad69f11a132153a83b6d8130d69a62e984" exitCode=0 Mar 20 14:52:46 crc kubenswrapper[4895]: I0320 14:52:46.582141 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerDied","Data":"6b3c0aed99c1c82a6d0ab588b183c9ad69f11a132153a83b6d8130d69a62e984"} Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.012252 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.083189 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content\") pod \"ed4603e8-0f1b-423d-ad3a-2394c1085368\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.083752 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities\") pod \"ed4603e8-0f1b-423d-ad3a-2394c1085368\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.084029 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4525j\" (UniqueName: \"kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j\") pod \"ed4603e8-0f1b-423d-ad3a-2394c1085368\" (UID: \"ed4603e8-0f1b-423d-ad3a-2394c1085368\") " Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.097221 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities" (OuterVolumeSpecName: "utilities") pod "ed4603e8-0f1b-423d-ad3a-2394c1085368" (UID: "ed4603e8-0f1b-423d-ad3a-2394c1085368"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.098034 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j" (OuterVolumeSpecName: "kube-api-access-4525j") pod "ed4603e8-0f1b-423d-ad3a-2394c1085368" (UID: "ed4603e8-0f1b-423d-ad3a-2394c1085368"). InnerVolumeSpecName "kube-api-access-4525j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.144657 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed4603e8-0f1b-423d-ad3a-2394c1085368" (UID: "ed4603e8-0f1b-423d-ad3a-2394c1085368"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.188002 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.188035 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed4603e8-0f1b-423d-ad3a-2394c1085368-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.188048 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4525j\" (UniqueName: \"kubernetes.io/projected/ed4603e8-0f1b-423d-ad3a-2394c1085368-kube-api-access-4525j\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.601082 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqlfx" event={"ID":"ed4603e8-0f1b-423d-ad3a-2394c1085368","Type":"ContainerDied","Data":"e7247c567106256438eb456f7d0e49831f68cc306112ef0a38de8ba956bda58a"} Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.601131 4895 scope.go:117] "RemoveContainer" containerID="6b3c0aed99c1c82a6d0ab588b183c9ad69f11a132153a83b6d8130d69a62e984" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.601251 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqlfx" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.625671 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.635865 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqlfx"] Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.640210 4895 scope.go:117] "RemoveContainer" containerID="97d190db3fa7b946835157aaaf205167f8dc9561c70a3b8beb2ebeb4b08276ab" Mar 20 14:52:47 crc kubenswrapper[4895]: I0320 14:52:47.662663 4895 scope.go:117] "RemoveContainer" containerID="70747841bc006888764b0f3ff3a09750cc6d281c9064a0b53140a3dcc7977608" Mar 20 14:52:49 crc kubenswrapper[4895]: I0320 14:52:49.229202 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" path="/var/lib/kubelet/pods/ed4603e8-0f1b-423d-ad3a-2394c1085368/volumes" Mar 20 14:53:37 crc kubenswrapper[4895]: I0320 14:53:37.096863 4895 generic.go:334] "Generic (PLEG): container finished" podID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerID="5dab94cf46dc749b640bbf5d6a22fc44f58db65f0132f6b03fd3e95b324b704f" exitCode=0 Mar 20 14:53:37 crc kubenswrapper[4895]: I0320 14:53:37.096955 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rrdnn/must-gather-ssw99" event={"ID":"f685eaf5-ecfb-4102-8d32-f200e5346700","Type":"ContainerDied","Data":"5dab94cf46dc749b640bbf5d6a22fc44f58db65f0132f6b03fd3e95b324b704f"} Mar 20 14:53:37 crc kubenswrapper[4895]: I0320 14:53:37.098061 4895 scope.go:117] "RemoveContainer" containerID="5dab94cf46dc749b640bbf5d6a22fc44f58db65f0132f6b03fd3e95b324b704f" Mar 20 14:53:37 crc kubenswrapper[4895]: I0320 14:53:37.999103 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-rrdnn_must-gather-ssw99_f685eaf5-ecfb-4102-8d32-f200e5346700/gather/0.log" Mar 20 14:53:48 crc kubenswrapper[4895]: I0320 14:53:48.166884 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rrdnn/must-gather-ssw99"] Mar 20 14:53:48 crc kubenswrapper[4895]: I0320 14:53:48.167505 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rrdnn/must-gather-ssw99" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="copy" containerID="cri-o://5d7114f836d125b623b896e1877cf8d7e3c2ef38476b7e81e05e9d8a2001155a" gracePeriod=2 Mar 20 14:53:48 crc kubenswrapper[4895]: I0320 14:53:48.180560 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rrdnn/must-gather-ssw99"] Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.285118 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rrdnn_must-gather-ssw99_f685eaf5-ecfb-4102-8d32-f200e5346700/copy/0.log" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.292484 4895 generic.go:334] "Generic (PLEG): container finished" podID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerID="5d7114f836d125b623b896e1877cf8d7e3c2ef38476b7e81e05e9d8a2001155a" exitCode=143 Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.491147 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rrdnn_must-gather-ssw99_f685eaf5-ecfb-4102-8d32-f200e5346700/copy/0.log" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.491477 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.513981 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output\") pod \"f685eaf5-ecfb-4102-8d32-f200e5346700\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.514249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whq47\" (UniqueName: \"kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47\") pod \"f685eaf5-ecfb-4102-8d32-f200e5346700\" (UID: \"f685eaf5-ecfb-4102-8d32-f200e5346700\") " Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.531629 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47" (OuterVolumeSpecName: "kube-api-access-whq47") pod "f685eaf5-ecfb-4102-8d32-f200e5346700" (UID: "f685eaf5-ecfb-4102-8d32-f200e5346700"). InnerVolumeSpecName "kube-api-access-whq47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.616379 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whq47\" (UniqueName: \"kubernetes.io/projected/f685eaf5-ecfb-4102-8d32-f200e5346700-kube-api-access-whq47\") on node \"crc\" DevicePath \"\"" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.795222 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f685eaf5-ecfb-4102-8d32-f200e5346700" (UID: "f685eaf5-ecfb-4102-8d32-f200e5346700"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:53:49 crc kubenswrapper[4895]: I0320 14:53:49.822985 4895 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f685eaf5-ecfb-4102-8d32-f200e5346700-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 14:53:50 crc kubenswrapper[4895]: I0320 14:53:50.304198 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rrdnn_must-gather-ssw99_f685eaf5-ecfb-4102-8d32-f200e5346700/copy/0.log" Mar 20 14:53:50 crc kubenswrapper[4895]: I0320 14:53:50.304797 4895 scope.go:117] "RemoveContainer" containerID="5d7114f836d125b623b896e1877cf8d7e3c2ef38476b7e81e05e9d8a2001155a" Mar 20 14:53:50 crc kubenswrapper[4895]: I0320 14:53:50.304810 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rrdnn/must-gather-ssw99" Mar 20 14:53:50 crc kubenswrapper[4895]: I0320 14:53:50.328730 4895 scope.go:117] "RemoveContainer" containerID="5dab94cf46dc749b640bbf5d6a22fc44f58db65f0132f6b03fd3e95b324b704f" Mar 20 14:53:51 crc kubenswrapper[4895]: I0320 14:53:51.225735 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" path="/var/lib/kubelet/pods/f685eaf5-ecfb-4102-8d32-f200e5346700/volumes" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.168677 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566974-7jb5d"] Mar 20 14:54:00 crc kubenswrapper[4895]: E0320 14:54:00.169502 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="registry-server" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169513 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="registry-server" Mar 20 14:54:00 crc kubenswrapper[4895]: E0320 14:54:00.169523 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="extract-utilities" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169529 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="extract-utilities" Mar 20 14:54:00 crc kubenswrapper[4895]: E0320 14:54:00.169540 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="copy" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169546 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="copy" Mar 20 14:54:00 crc kubenswrapper[4895]: E0320 14:54:00.169571 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="extract-content" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169577 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="extract-content" Mar 20 14:54:00 crc kubenswrapper[4895]: E0320 14:54:00.169585 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="gather" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169590 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="gather" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169770 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4603e8-0f1b-423d-ad3a-2394c1085368" containerName="registry-server" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169791 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="copy" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.169804 4895 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f685eaf5-ecfb-4102-8d32-f200e5346700" containerName="gather" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.170719 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.174414 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.175116 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.175600 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.226186 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-7jb5d"] Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.229325 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjt8\" (UniqueName: \"kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8\") pod \"auto-csr-approver-29566974-7jb5d\" (UID: \"dece9cd0-b5cc-453e-8595-ca86a54230fb\") " pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.334058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjt8\" (UniqueName: \"kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8\") pod \"auto-csr-approver-29566974-7jb5d\" (UID: \"dece9cd0-b5cc-453e-8595-ca86a54230fb\") " pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.360430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjt8\" 
(UniqueName: \"kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8\") pod \"auto-csr-approver-29566974-7jb5d\" (UID: \"dece9cd0-b5cc-453e-8595-ca86a54230fb\") " pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:00 crc kubenswrapper[4895]: I0320 14:54:00.497176 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:01 crc kubenswrapper[4895]: I0320 14:54:01.264267 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-7jb5d"] Mar 20 14:54:01 crc kubenswrapper[4895]: I0320 14:54:01.410810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" event={"ID":"dece9cd0-b5cc-453e-8595-ca86a54230fb","Type":"ContainerStarted","Data":"4c36c27c53704a2896130fb75d89674234ceaac2b66f37d7321ed1bc8fbf3639"} Mar 20 14:54:03 crc kubenswrapper[4895]: I0320 14:54:03.429323 4895 generic.go:334] "Generic (PLEG): container finished" podID="dece9cd0-b5cc-453e-8595-ca86a54230fb" containerID="990fab436d274f110d0bd2b17042bb6600ef638587d304a212fd0e2bba1a837b" exitCode=0 Mar 20 14:54:03 crc kubenswrapper[4895]: I0320 14:54:03.429475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" event={"ID":"dece9cd0-b5cc-453e-8595-ca86a54230fb","Type":"ContainerDied","Data":"990fab436d274f110d0bd2b17042bb6600ef638587d304a212fd0e2bba1a837b"} Mar 20 14:54:05 crc kubenswrapper[4895]: I0320 14:54:05.657782 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:05 crc kubenswrapper[4895]: I0320 14:54:05.744473 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tjt8\" (UniqueName: \"kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8\") pod \"dece9cd0-b5cc-453e-8595-ca86a54230fb\" (UID: \"dece9cd0-b5cc-453e-8595-ca86a54230fb\") " Mar 20 14:54:05 crc kubenswrapper[4895]: I0320 14:54:05.764601 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8" (OuterVolumeSpecName: "kube-api-access-2tjt8") pod "dece9cd0-b5cc-453e-8595-ca86a54230fb" (UID: "dece9cd0-b5cc-453e-8595-ca86a54230fb"). InnerVolumeSpecName "kube-api-access-2tjt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:54:05 crc kubenswrapper[4895]: I0320 14:54:05.847735 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tjt8\" (UniqueName: \"kubernetes.io/projected/dece9cd0-b5cc-453e-8595-ca86a54230fb-kube-api-access-2tjt8\") on node \"crc\" DevicePath \"\"" Mar 20 14:54:06 crc kubenswrapper[4895]: I0320 14:54:06.502862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" event={"ID":"dece9cd0-b5cc-453e-8595-ca86a54230fb","Type":"ContainerDied","Data":"4c36c27c53704a2896130fb75d89674234ceaac2b66f37d7321ed1bc8fbf3639"} Mar 20 14:54:06 crc kubenswrapper[4895]: I0320 14:54:06.503148 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c36c27c53704a2896130fb75d89674234ceaac2b66f37d7321ed1bc8fbf3639" Mar 20 14:54:06 crc kubenswrapper[4895]: I0320 14:54:06.503243 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-7jb5d" Mar 20 14:54:06 crc kubenswrapper[4895]: I0320 14:54:06.732985 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-vkp5t"] Mar 20 14:54:06 crc kubenswrapper[4895]: I0320 14:54:06.750734 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-vkp5t"] Mar 20 14:54:07 crc kubenswrapper[4895]: I0320 14:54:07.221921 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6d77fb-5b57-4c86-a555-358411dc9392" path="/var/lib/kubelet/pods/bd6d77fb-5b57-4c86-a555-358411dc9392/volumes" Mar 20 14:54:23 crc kubenswrapper[4895]: I0320 14:54:23.123813 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:54:23 crc kubenswrapper[4895]: I0320 14:54:23.124276 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:54:28 crc kubenswrapper[4895]: I0320 14:54:28.444845 4895 scope.go:117] "RemoveContainer" containerID="bec683f63221d09824bf5c2847e5cba9d7ef3a49a2205279cc99e8ff7f324709" Mar 20 14:54:52 crc kubenswrapper[4895]: I0320 14:54:52.310926 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:54:52 crc kubenswrapper[4895]: 
I0320 14:54:52.311450 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.296862 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.297438 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.297495 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.298297 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.298353 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" 
containerName="machine-config-daemon" containerID="cri-o://1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f" gracePeriod=600 Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.782737 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f" exitCode=0 Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.782814 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f"} Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.783352 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerStarted","Data":"398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38"} Mar 20 14:55:22 crc kubenswrapper[4895]: I0320 14:55:22.783380 4895 scope.go:117] "RemoveContainer" containerID="a497fb25f346e4fd354c8b9e43da1ebd3518885eb76458f45cb22b1f1988b5a5" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.179155 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566976-ttz9m"] Mar 20 14:56:00 crc kubenswrapper[4895]: E0320 14:56:00.180707 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dece9cd0-b5cc-453e-8595-ca86a54230fb" containerName="oc" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.180729 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dece9cd0-b5cc-453e-8595-ca86a54230fb" containerName="oc" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.181203 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dece9cd0-b5cc-453e-8595-ca86a54230fb" containerName="oc" Mar 20 14:56:00 
crc kubenswrapper[4895]: I0320 14:56:00.183940 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.196726 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.197003 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.197137 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.249293 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-ttz9m"] Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.296641 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmfg\" (UniqueName: \"kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg\") pod \"auto-csr-approver-29566976-ttz9m\" (UID: \"51f86843-df7b-4527-83ef-c2a0f2e32737\") " pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.401659 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmfg\" (UniqueName: \"kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg\") pod \"auto-csr-approver-29566976-ttz9m\" (UID: \"51f86843-df7b-4527-83ef-c2a0f2e32737\") " pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.424577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmfg\" (UniqueName: \"kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg\") 
pod \"auto-csr-approver-29566976-ttz9m\" (UID: \"51f86843-df7b-4527-83ef-c2a0f2e32737\") " pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:00 crc kubenswrapper[4895]: I0320 14:56:00.555410 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:01 crc kubenswrapper[4895]: I0320 14:56:01.345839 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-ttz9m"] Mar 20 14:56:01 crc kubenswrapper[4895]: I0320 14:56:01.354138 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:56:02 crc kubenswrapper[4895]: I0320 14:56:02.176298 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" event={"ID":"51f86843-df7b-4527-83ef-c2a0f2e32737","Type":"ContainerStarted","Data":"5b0c8befdbfa587e23e7128fbcbadecd8afb4c968839f9c1020b35c34fbd9cca"} Mar 20 14:56:03 crc kubenswrapper[4895]: I0320 14:56:03.195287 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" event={"ID":"51f86843-df7b-4527-83ef-c2a0f2e32737","Type":"ContainerStarted","Data":"9c8b6f6ce10427bd4d718b63c3c7e02bf5d04b6d41c5896a53a52977668ea371"} Mar 20 14:56:04 crc kubenswrapper[4895]: I0320 14:56:04.213692 4895 generic.go:334] "Generic (PLEG): container finished" podID="51f86843-df7b-4527-83ef-c2a0f2e32737" containerID="9c8b6f6ce10427bd4d718b63c3c7e02bf5d04b6d41c5896a53a52977668ea371" exitCode=0 Mar 20 14:56:04 crc kubenswrapper[4895]: I0320 14:56:04.213747 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" event={"ID":"51f86843-df7b-4527-83ef-c2a0f2e32737","Type":"ContainerDied","Data":"9c8b6f6ce10427bd4d718b63c3c7e02bf5d04b6d41c5896a53a52977668ea371"} Mar 20 14:56:06 crc kubenswrapper[4895]: I0320 14:56:06.637673 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:06 crc kubenswrapper[4895]: I0320 14:56:06.745170 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmmfg\" (UniqueName: \"kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg\") pod \"51f86843-df7b-4527-83ef-c2a0f2e32737\" (UID: \"51f86843-df7b-4527-83ef-c2a0f2e32737\") " Mar 20 14:56:06 crc kubenswrapper[4895]: I0320 14:56:06.758253 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg" (OuterVolumeSpecName: "kube-api-access-bmmfg") pod "51f86843-df7b-4527-83ef-c2a0f2e32737" (UID: "51f86843-df7b-4527-83ef-c2a0f2e32737"). InnerVolumeSpecName "kube-api-access-bmmfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:06 crc kubenswrapper[4895]: I0320 14:56:06.848416 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmmfg\" (UniqueName: \"kubernetes.io/projected/51f86843-df7b-4527-83ef-c2a0f2e32737-kube-api-access-bmmfg\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:07 crc kubenswrapper[4895]: I0320 14:56:07.242187 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" event={"ID":"51f86843-df7b-4527-83ef-c2a0f2e32737","Type":"ContainerDied","Data":"5b0c8befdbfa587e23e7128fbcbadecd8afb4c968839f9c1020b35c34fbd9cca"} Mar 20 14:56:07 crc kubenswrapper[4895]: I0320 14:56:07.242246 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0c8befdbfa587e23e7128fbcbadecd8afb4c968839f9c1020b35c34fbd9cca" Mar 20 14:56:07 crc kubenswrapper[4895]: I0320 14:56:07.242246 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-ttz9m" Mar 20 14:56:07 crc kubenswrapper[4895]: I0320 14:56:07.757161 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-zssq9"] Mar 20 14:56:07 crc kubenswrapper[4895]: I0320 14:56:07.771021 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-zssq9"] Mar 20 14:56:09 crc kubenswrapper[4895]: I0320 14:56:09.309095 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4cc896-130e-48d3-86c3-c56c202cd7e8" path="/var/lib/kubelet/pods/ed4cc896-130e-48d3-86c3-c56c202cd7e8/volumes" Mar 20 14:56:28 crc kubenswrapper[4895]: I0320 14:56:28.577164 4895 scope.go:117] "RemoveContainer" containerID="53a1be03390d3192e2aab06b0c8dccd3382fd3db581451f841ddefb08b658da7" Mar 20 14:57:22 crc kubenswrapper[4895]: I0320 14:57:22.297865 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:57:22 crc kubenswrapper[4895]: I0320 14:57:22.298316 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:57:52 crc kubenswrapper[4895]: I0320 14:57:52.296852 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:57:52 crc kubenswrapper[4895]: 
I0320 14:57:52.297406 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.139744 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566978-hpm7k"]
Mar 20 14:58:00 crc kubenswrapper[4895]: E0320 14:58:00.142018 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f86843-df7b-4527-83ef-c2a0f2e32737" containerName="oc"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.142046 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f86843-df7b-4527-83ef-c2a0f2e32737" containerName="oc"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.142235 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f86843-df7b-4527-83ef-c2a0f2e32737" containerName="oc"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.143019 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.144776 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.144831 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.145239 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-84lcq"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.155470 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-hpm7k"]
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.328141 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltc8\" (UniqueName: \"kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8\") pod \"auto-csr-approver-29566978-hpm7k\" (UID: \"ed43a0bd-8c6a-47ac-92dd-52209df472c9\") " pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.429562 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltc8\" (UniqueName: \"kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8\") pod \"auto-csr-approver-29566978-hpm7k\" (UID: \"ed43a0bd-8c6a-47ac-92dd-52209df472c9\") " pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.452668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltc8\" (UniqueName: \"kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8\") pod \"auto-csr-approver-29566978-hpm7k\" (UID: \"ed43a0bd-8c6a-47ac-92dd-52209df472c9\") " pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:00 crc kubenswrapper[4895]: I0320 14:58:00.513806 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:01 crc kubenswrapper[4895]: I0320 14:58:01.308849 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-hpm7k"]
Mar 20 14:58:01 crc kubenswrapper[4895]: W0320 14:58:01.310446 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded43a0bd_8c6a_47ac_92dd_52209df472c9.slice/crio-b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f WatchSource:0}: Error finding container b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f: Status 404 returned error can't find the container with id b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f
Mar 20 14:58:01 crc kubenswrapper[4895]: I0320 14:58:01.342401 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-hpm7k" event={"ID":"ed43a0bd-8c6a-47ac-92dd-52209df472c9","Type":"ContainerStarted","Data":"b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f"}
Mar 20 14:58:04 crc kubenswrapper[4895]: I0320 14:58:04.384179 4895 generic.go:334] "Generic (PLEG): container finished" podID="ed43a0bd-8c6a-47ac-92dd-52209df472c9" containerID="05ff09de89de43aa2e2df4ba748049941fd5c8ef40cca5600d3120fdff82ab2d" exitCode=0
Mar 20 14:58:04 crc kubenswrapper[4895]: I0320 14:58:04.384240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-hpm7k" event={"ID":"ed43a0bd-8c6a-47ac-92dd-52209df472c9","Type":"ContainerDied","Data":"05ff09de89de43aa2e2df4ba748049941fd5c8ef40cca5600d3120fdff82ab2d"}
Mar 20 14:58:06 crc kubenswrapper[4895]: I0320 14:58:06.671441 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:06 crc kubenswrapper[4895]: I0320 14:58:06.808955 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kltc8\" (UniqueName: \"kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8\") pod \"ed43a0bd-8c6a-47ac-92dd-52209df472c9\" (UID: \"ed43a0bd-8c6a-47ac-92dd-52209df472c9\") "
Mar 20 14:58:06 crc kubenswrapper[4895]: I0320 14:58:06.816649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8" (OuterVolumeSpecName: "kube-api-access-kltc8") pod "ed43a0bd-8c6a-47ac-92dd-52209df472c9" (UID: "ed43a0bd-8c6a-47ac-92dd-52209df472c9"). InnerVolumeSpecName "kube-api-access-kltc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:58:06 crc kubenswrapper[4895]: I0320 14:58:06.911589 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kltc8\" (UniqueName: \"kubernetes.io/projected/ed43a0bd-8c6a-47ac-92dd-52209df472c9-kube-api-access-kltc8\") on node \"crc\" DevicePath \"\""
Mar 20 14:58:07 crc kubenswrapper[4895]: I0320 14:58:07.418544 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-hpm7k" event={"ID":"ed43a0bd-8c6a-47ac-92dd-52209df472c9","Type":"ContainerDied","Data":"b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f"}
Mar 20 14:58:07 crc kubenswrapper[4895]: I0320 14:58:07.418582 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1f91bdf2c63b032c0f2d8a931e028e2a7e178ce963611de0311cda92caf290f"
Mar 20 14:58:07 crc kubenswrapper[4895]: I0320 14:58:07.418587 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-hpm7k"
Mar 20 14:58:07 crc kubenswrapper[4895]: I0320 14:58:07.758268 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-jbc9n"]
Mar 20 14:58:07 crc kubenswrapper[4895]: I0320 14:58:07.770131 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-jbc9n"]
Mar 20 14:58:09 crc kubenswrapper[4895]: I0320 14:58:09.223991 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdebf311-01b8-4474-b598-4458290a0100" path="/var/lib/kubelet/pods/bdebf311-01b8-4474-b598-4458290a0100/volumes"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.297102 4895 patch_prober.go:28] interesting pod/machine-config-daemon-w9jrr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.297718 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.297768 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.298745 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38"} pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.298826 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerName="machine-config-daemon" containerID="cri-o://398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38" gracePeriod=600
Mar 20 14:58:22 crc kubenswrapper[4895]: E0320 14:58:22.422167 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.586556 4895 generic.go:334] "Generic (PLEG): container finished" podID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8" containerID="398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38" exitCode=0
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.586602 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" event={"ID":"6e9e3134-0fea-4e77-a1e4-e74835ee41e8","Type":"ContainerDied","Data":"398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38"}
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.586633 4895 scope.go:117] "RemoveContainer" containerID="1368737b95d8198731c3699e169fb3f75430de5a23267452d7cbbcc0be884a4f"
Mar 20 14:58:22 crc kubenswrapper[4895]: I0320 14:58:22.587325 4895 scope.go:117] "RemoveContainer" containerID="398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38"
Mar 20 14:58:22 crc kubenswrapper[4895]: E0320 14:58:22.587609 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"
Mar 20 14:58:28 crc kubenswrapper[4895]: I0320 14:58:28.750454 4895 scope.go:117] "RemoveContainer" containerID="17a145f12d7b8c410c521ca54ec98b8622d106255fb9f384421e3962af7982d4"
Mar 20 14:58:33 crc kubenswrapper[4895]: I0320 14:58:33.212546 4895 scope.go:117] "RemoveContainer" containerID="398e69be011eab7357756b424a78b2744d7412561e92785cd061e4f0dc068f38"
Mar 20 14:58:33 crc kubenswrapper[4895]: E0320 14:58:33.213419 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w9jrr_openshift-machine-config-operator(6e9e3134-0fea-4e77-a1e4-e74835ee41e8)\"" pod="openshift-machine-config-operator/machine-config-daemon-w9jrr" podUID="6e9e3134-0fea-4e77-a1e4-e74835ee41e8"